 # Ensemble: Bagging, Random Forest, Boosting and Stacking

An ensemble of trees (in the form of bagging, random forest, or boosting) is usually preferred over a single decision tree. Continue reading Ensemble: Bagging, Random Forest, Boosting and Stacking

# Binary Classification Evaluation Summary

This article summarizes the popular evaluation metrics for binary classification problems. Continue reading Binary Classification Evaluation Summary

# Precision-Recall curve: an overview

We introduce an alternative to the ROC: the Precision-Recall curve (PR curve), a more reliable measure when positive samples are rare. Continue reading Precision-Recall curve: an overview

# ROC curve and AUC: a comprehensive overview

The well-known ROC curve plot, the Area Under the ROC Curve (AUC), and its variants. Continue reading ROC curve and AUC: a comprehensive overview

# Information Gain, Gain Ratio and Gini Index

Information Gain, Gain Ratio, and Gini Index are the three fundamental criteria for measuring the quality of a split in a Decision Tree. Continue reading Information Gain, Gain Ratio and Gini Index

# Logistic Regression: Advantages and Disadvantages

In the previous blogs, we discussed Logistic Regression and its assumptions. Today, the main topic is the theoretical and empirical strengths and weaknesses of this model. Continue reading Logistic Regression: Advantages and Disadvantages

# Naive Bayes classifier: a comprehensive guide

In this blog post, we present the Bayes formula and explain how to build a Naive Bayes classifier, along with its assumptions, strengths, and weaknesses. Continue reading Naive Bayes classifier: a comprehensive guide

# Assumptions of Logistic Regression

When these requirements, or assumptions, hold, we know that our Logistic Regression model is performing as well as it can. Continue reading Assumptions of Logistic Regression

# Logistic Regression tutorial

Following the previous overview, this article delves deeper into Logistic Regression. Continue reading Logistic Regression tutorial