Supervised Learning
Understanding Bagging and Boosting
Both are ensemble techniques: a set of weak learners is combined to create a strong learner that achieves better performance than any single one.
Error = Bias + Variance + Noise
Bagging, short for Bootstrap Aggregating
It is a way to increase accuracy by decreasing variance.
Done by generating additional datasets via sampling with replacement, producing multisets of the same cardinality/size as the original dataset.
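The resampling step above can be sketched in a few lines of NumPy; the dataset here is a stand-in, not anything from the deck:

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.arange(10)   # stand-in for the original dataset
n_sets = 3

# Each bootstrap multiset samples with replacement and keeps the original
# cardinality, so individual rows may repeat or be absent entirely.
multisets = [rng.choice(data, size=data.size, replace=True) for _ in range(n_sets)]
```

Each multiset has the same size as `data` but a different composition, which is what decorrelates the learners trained on them.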
Example: Random Forest
It grows fully developed decision trees (low bias, high variance) that are decorrelated so as to maximize the decrease in variance.
Since bagging cannot reduce bias, it requires large, unpruned trees.
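A minimal sketch of this on synthetic data (dataset and parameters are illustrative, assuming scikit-learn is available): `max_depth=None` leaves the trees unpruned, matching the "large unpruned trees" point above.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

# Fully grown (unpruned) trees keep each learner's bias low; averaging
# many decorrelated trees is what shrinks the variance.
rf = RandomForestClassifier(n_estimators=100, max_depth=None, random_state=42)
rf.fit(X_tr, y_tr)
test_accuracy = rf.score(X_te, y_te)
```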
Boosting
It is a way to increase accuracy by reducing bias.
A 2-step process:
Develop averagely performing models over subsets of the original data.
Boost their performance by combining them using a cost function (e.g. majority vote).
Note: each subset emphasizes elements that were misclassified, or nearly misclassified, by the previous model.
Example: Gradient Boosted Trees
These grow shallow decision trees (high bias, low variance), a.k.a. weak learners.
Error is reduced mainly by reducing bias: each new learner is developed taking the previous learners into account (sequentially).
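The sequential idea can be shown without any library: below, a toy gradient-boosting loop fits depth-1 regression stumps to the residual of the ensemble so far (all data and parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=200)
y = np.sin(X) + rng.normal(scale=0.1, size=200)

def fit_stump(x, resid):
    """Depth-1 regression tree: pick the best single threshold on x."""
    best = None
    for t in np.quantile(x, np.linspace(0.05, 0.95, 19)):
        left, right = resid[x <= t].mean(), resid[x > t].mean()
        err = ((resid - np.where(x <= t, left, right)) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left, right)
    return best[1], best[2], best[3]

pred = np.zeros_like(y)
learning_rate = 0.3
for _ in range(50):
    # Each weak learner is fit to the *residual* left by the previous
    # learners -- this is the sequential, bias-reducing step.
    t, left, right = fit_stump(X, y - pred)
    pred += learning_rate * np.where(X <= t, left, right)

mse = ((y - pred) ** 2).mean()
```

Each stump alone is a high-bias model; the sequence of residual fits drives the ensemble's bias down.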
Understanding Graphically
Comparison
Similarities and differences:
Both are ensemble methods to get N learners from 1 learner… but while Bagging builds them independently, Boosting adds new models that do well where previous models fail.
Both generate several training datasets by random sampling… but only Boosting determines weights for the data, to tip the scales in favor of the most difficult cases.
Both make the final decision by averaging the N learners (or taking a majority vote)… but it is an equally weighted average for Bagging and a weighted average for Boosting, with more weight given to learners that perform better on the training data.
Both are good at reducing variance and provide higher stability… but only Boosting also tries to reduce bias. On the other hand, Bagging may mitigate overfitting, while Boosting can aggravate it.
Exploring the Scope of Supervised Learning in the Current Setup
Areas where supervised learning can be useful:
Feature selection for clustering
Evaluating features
Increasing the aggressiveness of the current setup
Bringing new rule ideas
Feature Selection / Feature Importance & Model Accuracy and Threshold Evaluation

Algorithm Used    Feature Importance Metric
XGBoost           F Score
Random Forest     Gini Index, Entropy
Feature Selection/Importance: XGBoost - F Score
Feature Selection/Importance: RF - Gini Index
Feature Selection/Importance: RF - Entropy
Feature Selection/Importance
Comparison between important features by Random Forest & XGBoost (analysis of top 15 important variables)

RF - Entropy: feature_21w, feature_sut, feature_du1, feature_sc3, feature_drh, feature_1a2, feature_sc18, feature_drl, feature_snc, feature_sc1, feature_2c3, feature_npb, feature_3e1, feature_bst, feature_nub
RF - Gini: feature_sut, feature_sc3, feature_21w, feature_sc18, feature_du1, feature_sc1, feature_drh, feature_drl, feature_1a2, feature_snc, feature_npb, feature_3e1, feature_tbu, feature_nub, feature_bst
XGBoost - F Score: feature_1a2, feature_2c3, feature_hhs, feature_nrp, feature_urh, feature_nub, feature_nup, feature_psc, feature_sncp, feature_3e1, feature_tpa, feature_snc, feature_bst, feature_tbu, feature_nub
Reason for the difference in feature importance between XGB & RF
When there are several correlated features, boosting tends to choose one and use it in several trees (if necessary); the other correlated features won't be used much (or at all). This makes sense: once a feature has been used, correlated features no longer help in the split process, since they bring no new information beyond the feature already used, and the learning is done serially.
Each tree of a Random Forest, by contrast, is not built from the same features (a random subset of features is drawn for each tree), so each correlated feature gets a chance to be selected in some tree. Looking at the whole model, all features have been used. The learning is done in parallel, so no tree is aware of what the other trees used.
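A small experiment one could run to see this effect (synthetic data; the duplicated column is an extreme case of correlation): compare where the importance of the two identical columns ends up in a boosted model versus a forest with per-split feature subsampling.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier

X, y = make_classification(n_samples=400, n_features=5, n_informative=3,
                           n_redundant=0, random_state=7)
X = np.hstack([X, X[:, [0]]])   # column 5 is an exact copy of column 0

gb = GradientBoostingClassifier(random_state=7).fit(X, y)
# max_features < n_features forces each split to see a random feature subset,
# giving both twins a chance across trees.
rf = RandomForestClassifier(max_features=2, random_state=7).fit(X, y)

twins_gb = gb.feature_importances_[[0, 5]]
twins_rf = rf.feature_importances_[[0, 5]]
```

Typically the boosted model concentrates the twins' importance on one of them, while the forest spreads it across both; exact numbers depend on the seed.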
Tree Growth in XGB
When you grow too many trees, they start to look very similar (once there is no loss remaining to learn), so the dominant feature becomes even more important. Shallow trees reinforce this trend, because few features can be important at the root of a tree (features shared between trees are most of the time the ones at their roots). So these results are not surprising.
In this case, random selection of columns (a rate around 0.8) may give interesting results. Decreasing eta may also help (it keeps more loss to explain after each iteration).
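The two remedies above map to standard XGBoost parameters; the values below are the ones suggested in the text plus illustrative fillers, not a tuned configuration:

```python
# Hypothetical XGBoost parameter dict reflecting the remedies discussed:
params = {
    "colsample_bytree": 0.8,  # random column selection per tree (rate ~0.8)
    "eta": 0.05,              # smaller learning rate keeps loss to explain
    "max_depth": 3,           # shallow trees, as in the setup described
}
```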
Model Accuracy and Threshold Evaluation
XGBoost
Threshold Accuracy TN FP FN TP
0 5.881% 0 46990 0 2936
0.1 87.353% 42229 4761 1553 1383
0.2 93.881% 46075 915 2140 796
0.3 94.722% 46691 299 2336 600
0.4 94.894% 46866 124 2425 511
0.5 94.902% 46923 67 2478 458
0.6 94.866% 46956 34 2529 407
0.7 94.856% 46973 17 2551 385
0.8 94.824% 46977 13 2571 365
0.9 94.776% 46982 8 2600 336
1 94.119% 46990 0 2936 0
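The threshold sweep behind a table like the one above can be reproduced with a few lines of NumPy (labels and scores here are synthetic stand-ins, not the deck's data):

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=1000)                          # stand-in labels
p = np.clip(y * 0.6 + rng.normal(0.2, 0.25, 1000), 0, 1)   # stand-in scores

for thr in np.arange(0.0, 1.01, 0.1):
    pred = (p >= thr).astype(int)
    tp = int(((pred == 1) & (y == 1)).sum())
    tn = int(((pred == 0) & (y == 0)).sum())
    fp = int(((pred == 1) & (y == 0)).sum())
    fn = int(((pred == 0) & (y == 1)).sum())
    acc = (tp + tn) / y.size
    print(f"{thr:.1f}  {acc:.3%}  TN={tn} FP={fp} FN={fn} TP={tp}")
```

At threshold 0 everything is predicted positive (TN = FN = 0); at threshold 1 almost everything is predicted negative, which is exactly the pattern in the first and last rows of the table.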
Model Accuracy and Threshold Evaluation
Random Forest: Criteria Comparison (Gini Index vs Entropy)
Criteria Accuracy TN FP FN TP
Gini 94.800% 46968 22 2574 362
Entropy 94.788% 46967 23 2579 357
Model Accuracy and Threshold Evaluation
Comparison between Random Forest & XGBoost
Random Forest:
Criteria Accuracy TN FP FN TP
Gini 94.800% 46968 22 2574 362
Entropy 94.788% 46967 23 2579 357
XGBoost:
Threshold Accuracy TN FP FN TP
0 5.881% 0 46990 0 2936
0.1 87.353% 42229 4761 1553 1383
0.2 93.881% 46075 915 2140 796
0.3 94.722% 46691 299 2336 600
0.4 94.894% 46866 124 2425 511
0.5 94.902% 46923 67 2478 458
0.6 94.866% 46956 34 2529 407
0.7 94.856% 46973 17 2551 385
0.8 94.824% 46977 13 2571 365
0.9 94.776% 46982 8 2600 336
1 94.119% 46990 0 2936 0
Bringing New Rule Ideas
Comparison between Random Forest & XGBoost

Systane Global education training centre
 
393947940-The-Dell-EMC-PowerMax-Family-Overview.pdf
393947940-The-Dell-EMC-PowerMax-Family-Overview.pdf393947940-The-Dell-EMC-PowerMax-Family-Overview.pdf
393947940-The-Dell-EMC-PowerMax-Family-Overview.pdf
 
一比一原版(unb毕业证书)新布伦瑞克大学毕业证如何办理
一比一原版(unb毕业证书)新布伦瑞克大学毕业证如何办理一比一原版(unb毕业证书)新布伦瑞克大学毕业证如何办理
一比一原版(unb毕业证书)新布伦瑞克大学毕业证如何办理
 
Cal Girls Mansarovar Jaipur | 08445551418 | Rajni High Profile Girls Call in ...
Cal Girls Mansarovar Jaipur | 08445551418 | Rajni High Profile Girls Call in ...Cal Girls Mansarovar Jaipur | 08445551418 | Rajni High Profile Girls Call in ...
Cal Girls Mansarovar Jaipur | 08445551418 | Rajni High Profile Girls Call in ...
 
Agritech Ecosystem in Indonesia2 2023.pdf
Agritech Ecosystem in Indonesia2 2023.pdfAgritech Ecosystem in Indonesia2 2023.pdf
Agritech Ecosystem in Indonesia2 2023.pdf
 

Understanding Bagging and Boosting

  • 2. Understanding Bagging and Boosting. Both are ensemble techniques in which a set of weak learners is combined to create a strong learner that achieves better performance than any single one. Error = Bias + Variance + Noise.
  • 3. Bagging, short for Bootstrap Aggregating, is a way to increase accuracy by decreasing variance. This is done by generating additional training sets through sampling with replacement (bootstrapping), producing multisets of the same cardinality/size as the original dataset. Example: Random Forest. It develops fully grown decision trees (low bias, high variance) that are kept uncorrelated to maximize the decrease in variance. Since bagging cannot reduce bias, it requires large, unpruned trees.
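The bagging procedure above can be sketched in a few lines: train fully grown trees on bootstrap resamples of the same size as the original set, then aggregate by majority vote. This is a minimal illustration on a synthetic dataset, not the slides' data; all sizes and seeds are arbitrary.

```python
# Minimal bagging sketch: N fully grown trees on bootstrap resamples,
# aggregated by majority vote. Dataset and constants are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

trees = []
for _ in range(25):
    # Bootstrap: sample with replacement, same cardinality as the original set
    idx = rng.randint(0, len(X), size=len(X))
    # Fully grown (unpruned) tree: low bias, high variance
    tree = DecisionTreeClassifier(random_state=0)
    trees.append(tree.fit(X[idx], y[idx]))

# Aggregate: equally weighted majority vote over the trees
votes = np.mean([t.predict(X) for t in trees], axis=0)
y_pred = (votes >= 0.5).astype(int)
print("training accuracy:", (y_pred == y).mean())
```

Random Forest adds one more ingredient on top of this sketch: a random subset of features at each split, which decorrelates the trees and increases the variance reduction.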
  • 4. Boosting is a way to increase accuracy by reducing bias. It is a two-step process: first, develop averagely performing models over subsets of the original data; then boost their performance by combining them using a cost function (e.g. majority vote). Note: every subset emphasizes the elements that were misclassified (or nearly misclassified) by the previous model. Example: Gradient Boosted Trees. Boosting develops shallow decision trees (high bias, low variance), a.k.a. weak learners, and reduces error mainly by reducing bias: each new learner is developed taking the previous learner into account (sequential).
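The sequential reweighting idea can be sketched with a simplified AdaBoost-style loop (a close cousin of the gradient boosting named on the slide): each new shallow tree is fit with more weight on the points the previous learners got wrong, and the final vote weights learners by their training performance. Synthetic data; the number of rounds and seeds are arbitrary.

```python
# Simplified AdaBoost-style boosting with decision stumps (shallow trees,
# high bias, low variance). Each round up-weights misclassified points.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
y_pm = np.where(y == 1, 1, -1)            # labels in {-1, +1}

w = np.full(len(X), 1 / len(X))           # start with uniform weights
learners, alphas = [], []
for _ in range(20):
    stump = DecisionTreeClassifier(max_depth=1).fit(X, y_pm, sample_weight=w)
    pred = stump.predict(X)
    err = w[pred != y_pm].sum()           # weighted training error
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))
    w *= np.exp(-alpha * y_pm * pred)     # up-weight the misclassified points
    w /= w.sum()
    learners.append(stump)
    alphas.append(alpha)

# Final decision: performance-weighted vote of the weak learners
score = sum(a * s.predict(X) for a, s in zip(alphas, learners))
y_pred = np.sign(score)
print("training accuracy:", (y_pred == y_pm).mean())
```

Gradient boosting replaces the explicit reweighting with fitting each new tree to the gradient of a loss, but the sequential, bias-reducing structure is the same.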
  • 10. Comparison. Similarities and differences: Both are ensemble methods that build N learners from one base learner… but while the learners are built independently in Bagging, Boosting adds new models that do well where previous models fail. Both generate several training sets by random sampling… but only Boosting determines weights for the data to tip the scales in favor of the most difficult cases. Both make the final decision by averaging the N learners (or taking a majority vote)… but it is an equally weighted average for Bagging and a performance-weighted average for Boosting, with more weight given to learners that do better on the training data. Both are good at reducing variance and provide higher stability… but only Boosting tries to reduce bias. On the other hand, Bagging may solve the overfitting problem, while Boosting can increase it.
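The aggregation difference in the comparison above can be shown with a tiny numeric example: the same three learners' votes, combined once with equal weights (Bagging) and once with performance weights (Boosting). The vote matrix and the weights are made up for illustration.

```python
# Equal-weight vote (bagging) vs performance-weighted vote (boosting).
import numpy as np

# Three learners' class votes (0/1) for four samples
votes = np.array([[1, 0, 1, 1],
                  [1, 1, 0, 1],
                  [0, 1, 0, 1]])

# Bagging: every learner counts the same
bagging_pred = (votes.mean(axis=0) >= 0.5).astype(int)

# Boosting: learners weighted by (assumed) training performance
w = np.array([0.5, 0.3, 0.2])
boosting_pred = ((w[:, None] * votes).sum(axis=0) >= 0.5).astype(int)

print("equal-weight vote:   ", bagging_pred)   # → [1 1 0 1]
print("performance-weighted:", boosting_pred)  # → [1 1 1 1]
```

Note the third sample: a minority of learners predict 1, but because the strongest learner is among them, the weighted vote flips the decision.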
  • 11. Exploring the Scope of Supervised Learning in the Current Setup. Areas where supervised learning can be useful: feature selection for clustering, evaluating features, increasing the aggressiveness of the current setup, bringing new rule ideas.
  • 12. Feature Selection / Feature Importance & Model Accuracy and Threshold Evaluation.
      Algorithm Used    Feature Importance Metric
      XGBoost           F Score
      Random Forest     Gini Index, Entropy
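The Random Forest side of the table above can be sketched directly with sklearn, which exposes both split criteria and impurity-based importances; the XGBoost F score (split counts per feature) is omitted here to keep the example dependency-free. Data is synthetic and all sizes are illustrative.

```python
# Feature-importance rankings from a Random Forest under both split
# criteria named on the slide (Gini index and entropy).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=400, n_features=8,
                           n_informative=3, random_state=0)

rankings = {}
for criterion in ("gini", "entropy"):
    rf = RandomForestClassifier(n_estimators=100, criterion=criterion,
                                random_state=0).fit(X, y)
    # Impurity-based importances, sorted from most to least important
    rankings[criterion] = np.argsort(rf.feature_importances_)[::-1]

for criterion, order in rankings.items():
    print(criterion, "top features:", order[:3])
```

With the real xgboost library, the analogous call would be the booster's split-count scores rather than impurity importances, which is one reason the two models' rankings can disagree.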
  • 16. Feature Selection / Importance: comparison of important features by Random Forest & XGBoost. Analysis of the top 15 important variables:
      RF - Entropy: feature_21w, feature_sut, feature_du1, feature_sc3, feature_drh, feature_1a2, feature_sc18, feature_drl, feature_snc, feature_sc1, feature_2c3, feature_npb, feature_3e1, feature_bst, feature_nub
      RF - Gini: feature_sut, feature_sc3, feature_21w, feature_sc18, feature_du1, feature_sc1, feature_drh, feature_drl, feature_1a2, feature_snc, feature_npb, feature_3e1, feature_tbu, feature_nub, feature_bst
      XGBoost - F Score: feature_1a2, feature_2c3, feature_hhs, feature_nrp, feature_urh, feature_nub, feature_nup, feature_psc, feature_sncp, feature_3e1, feature_tpa, feature_snc, feature_bst, feature_tbu, feature_nub
  • 17. Feature Selection / Importance: reason for the difference in feature importance between XGBoost and Random Forest.
      When there are several correlated features, boosting tends to choose one of them and use it in several trees (if necessary). The other correlated features won't be used much, or at all. This makes sense: once one of them has been used, the correlated features can no longer help in the split process, since they bring no new information beyond the already-used feature. The learning is done serially.
      Each tree of a Random Forest, by contrast, is not built from the same features (there is a random selection of features to use for each tree), so each correlated feature has a chance to be selected in one of the trees. Looking at the whole model, it has therefore used all the features. The learning is done in parallel, so no tree is aware of what the other trees have used.
      Tree growth in XGBoost: when you grow too many trees, the trees start to look very similar (once there is little loss remaining to learn), so the dominant feature becomes even more important. Shallow trees reinforce this trend, because there are few candidate features at the root of a tree (features shared between trees are most often the ones at their roots). So these results are not surprising. In this case, you may get interesting results with random column selection (a rate around 0.8); decreasing eta may also help (it keeps more loss to explain after each iteration).
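The correlated-features effect described above is easy to reproduce: duplicate one informative column and compare how a boosted model and a random forest distribute importance across the two identical copies. This uses sklearn's GradientBoostingClassifier as a stand-in for XGBoost; data is synthetic and the exact importance split will vary with seeds.

```python
# Duplicate an informative feature and compare importance attribution
# between serial boosting and a parallel random forest.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=5,
                           n_informative=3, random_state=0)
X_dup = np.hstack([X, X[:, [0]]])   # column 5 is an exact copy of column 0

gb = GradientBoostingClassifier(random_state=0).fit(X_dup, y)
# Small max_features forces per-split feature subsampling, so both copies
# get chances to be selected across different trees.
rf = RandomForestClassifier(n_estimators=200, max_features=2,
                            random_state=0).fit(X_dup, y)

print("boosting importance of the two copies:", gb.feature_importances_[[0, 5]])
print("forest   importance of the two copies:", rf.feature_importances_[[0, 5]])
```

Typically the boosted model leans heavily on one copy, while the forest splits credit between them, which is the mechanism behind the diverging rankings on slide 16.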
  • 18–23. Model Accuracy and Threshold Evaluation: XGBoost. [Chart slides comparing models A and B across probability thresholds; the figures themselves are not recoverable from the text.]
  • 24. Model Accuracy and Threshold Evaluation: XGBoost (models A and B).
      Threshold  Accuracy   TN     FP     FN    TP
      0          5.881%     0      46990  0     2936
      0.1        87.353%    42229  4761   1553  1383
      0.2        93.881%    46075  915    2140  796
      0.3        94.722%    46691  299    2336  600
      0.4        94.894%    46866  124    2425  511
      0.5        94.902%    46923  67     2478  458
      0.6        94.866%    46956  34     2529  407
      0.7        94.856%    46973  17     2551  385
      0.8        94.824%    46977  13     2571  365
      0.9        94.776%    46982  8      2600  336
      1          94.119%    46990  0      2936  0
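The sweep behind the table above is just a loop over probability cutoffs, recounting the confusion-matrix cells at each one. A minimal sketch, using tiny made-up scores and labels rather than the slide's dataset:

```python
# Threshold sweep: vary the probability cutoff and recount TN/FP/FN/TP.
import numpy as np

y_true  = np.array([0, 0, 0, 0, 1, 1, 1, 0, 1, 0])
y_score = np.array([0.1, 0.2, 0.3, 0.45, 0.55, 0.7, 0.9, 0.6, 0.4, 0.05])

for thr in (0.3, 0.5, 0.7):
    y_pred = (y_score >= thr).astype(int)
    tn = int(np.sum((y_true == 0) & (y_pred == 0)))
    fp = int(np.sum((y_true == 0) & (y_pred == 1)))
    fn = int(np.sum((y_true == 1) & (y_pred == 0)))
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))
    acc = (tn + tp) / len(y_true)
    print(f"thr={thr}: acc={acc:.2f} TN={tn} FP={fp} FN={fn} TP={tp}")
```

As in the slide's table, raising the threshold trades false positives for false negatives: at threshold 0 everything is predicted positive (TN = FN = 0), at threshold 1 everything negative (FP = TP = 0).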
  • 25. Model Accuracy and Threshold Evaluation: Random Forest, Gini index vs. entropy criteria.
      Criteria  Accuracy   TN     FP  FN    TP
      Gini      94.800%    46968  22  2574  362
      Entropy   94.788%    46967  23  2579  357
  • 26. Model Accuracy and Threshold Evaluation: comparison between Random Forest and XGBoost.
      Random Forest:
      Criteria  Accuracy   TN     FP  FN    TP
      Gini      94.800%    46968  22  2574  362
      Entropy   94.788%    46967  23  2579  357
      XGBoost (by threshold):
      Threshold  Accuracy   TN     FP     FN    TP
      0          5.881%     0      46990  0     2936
      0.1        87.353%    42229  4761   1553  1383
      0.2        93.881%    46075  915    2140  796
      0.3        94.722%    46691  299    2336  600
      0.4        94.894%    46866  124    2425  511
      0.5        94.902%    46923  67     2478  458
      0.6        94.866%    46956  34     2529  407
      0.7        94.856%    46973  17     2551  385
      0.8        94.824%    46977  13     2571  365
      0.9        94.776%    46982  8      2600  336
      1          94.119%    46990  0      2936  0
  • 27. Bringing New Rules Idea: comparison between Random Forest & XGBoost.
  • 28. Bringing New Rules Idea: comparison between Random Forest & XGBoost (continued).