ELU activation: A comprehensive analysis
Through various experiments, ELU has come to be accepted by many researchers as a strong successor to the original ReLU. Continue reading ELU activation: A comprehensive analysis
Batch Normalization (BatchNorm) is a widely used technique in Deep Learning; however, the reason why it works is often interpreted ambiguously. Continue reading Batch Normalization and why it works
Despite its simplicity, ReLU has achieved top performance across a variety of tasks in modern Machine Learning. Continue reading Rectifier Linear Unit (ReLU)