Batch Normalization and why it works
Batch Normalization (BatchNorm) is a very frequently used technique in Deep Learning; however, the reason why it works is often explained ambiguously. Continue reading Batch Normalization and why it works
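As a quick reference for what BatchNorm actually computes, here is a minimal sketch of the forward pass, assuming NumPy; the function name `batchnorm_forward` and the toy input are illustrative, not from the article itself.

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    # x: (batch_size, features) mini-batch of activations
    mu = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                     # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)   # normalize to zero mean, unit variance
    return gamma * x_hat + beta             # learnable scale and shift

x = np.random.randn(32, 4)
out = batchnorm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
```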
In previous blog posts, we discussed Logistic Regression and its assumptions. Today, the main topic is the theoretical and empirical advantages and disadvantages of this model. Continue reading Logistic Regression: Advantages and Disadvantages
Following the previous overview, this article delves deeper into Logistic Regression. Continue reading Logistic Regression tutorial
Train and cross-validate your Linear Regression model in Python with pre-defined or customized evaluation functions. Continue reading Linear Regression in Python
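A minimal sketch of this workflow, assuming scikit-learn is used (the article may use a different setup); the toy data and the custom metric `mean_abs_error` are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.metrics import make_scorer

# Toy data: y is roughly linear in X plus noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

model = LinearRegression()

# Pre-defined evaluation function built into scikit-learn
mse_scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")

# Customized evaluation function wrapped with make_scorer
def mean_abs_error(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

custom_scores = cross_val_score(
    model, X, y, cv=5,
    scoring=make_scorer(mean_abs_error, greater_is_better=False)
)

print(mse_scores.mean(), custom_scores.mean())
```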
This article presents the formulas for deriving the best-fit linear regression line. Continue reading How to make a Linear Regressor? (theory)
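For reference, the standard closed-form least-squares estimates in the simple one-variable case $y = \beta_0 + \beta_1 x$ (the article covers the full derivation) are:

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n}(x_i - \bar{x})^2}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$$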
Linear regression is frequently used in practice for these 7 reasons. Continue reading Advantages of Linear Regression