Tung M Phung's Blog

Tag: Activation Function

Deep Learning, Machine Learning - Data Mining

ELU activation: A comprehensive analysis

February 16, 2020 (updated July 31, 2020) · Tung.M.Phung

Through various experiments, ELU has been accepted by many researchers as a good successor to the original ReLU. Continue reading ELU activation: A comprehensive analysis
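For quick reference (a minimal sketch, not code from the post itself), ELU keeps the identity for positive inputs and smoothly saturates at -alpha for negative inputs, with alpha conventionally defaulting to 1.0:

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU: identity for x > 0; a smooth exponential curve
    # saturating at -alpha for x <= 0.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

print(elu(np.array([-2.0, 0.0, 2.0])))  # ~[-0.865, 0.0, 2.0]
```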

Deep Learning, Machine Learning - Data Mining

Batch Normalization and why it works

January 6, 2020 (updated January 4, 2021) · Tung.M.Phung

Batch Normalization (BatchNorm) is a very frequently used technique in Deep Learning; however, the reason it works is often interpreted ambiguously. Continue reading Batch Normalization and why it works
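As a rough illustration (a minimal NumPy sketch of the standard formulation, not code from the post), BatchNorm normalizes each feature over the batch and then applies a learned scale gamma and shift beta:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x: (batch, features). Normalize each feature over the batch,
    # then apply the learned scale (gamma) and shift (beta).
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.randn(32, 4) * 5.0 + 3.0
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0).round(3), y.std(axis=0).round(3))  # ~0 and ~1
```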

Deep Learning, Machine Learning - Data Mining

Rectifier Linear Unit (ReLU)

December 28, 2019 (updated July 31, 2020) · Tung.M.Phung

Despite its simplicity, ReLU previously achieved top performance across a variety of modern Machine Learning tasks. Continue reading Rectifier Linear Unit (ReLU)
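That simplicity is easy to see in code (a minimal sketch, not from the post): ReLU is just max(0, x) applied elementwise:

```python
import numpy as np

def relu(x):
    # ReLU: passes positive values through, clips negatives to zero.
    return np.maximum(0.0, x)

print(relu(np.array([-1.5, 0.0, 2.0])))  # [0. 0. 2.]
```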

Deep Learning, Machine Learning - Data Mining

Sigmoid, tanh activations and their loss of popularity

December 27, 2019 (updated July 31, 2020) · Tung.M.Phung

The sigmoid and tanh activation functions were very frequently used in the past but have been losing popularity in the era of Deep Learning. Continue reading Sigmoid, tanh activations and their loss of popularity
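For reference (a minimal sketch, not from the post), both functions squash their input into a bounded range, which is also why their gradients vanish for large |x|:

```python
import numpy as np

def sigmoid(x):
    # Squashes input into (0, 1); the gradient approaches 0
    # for large |x|, which contributes to vanishing gradients.
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-5.0, 0.0, 5.0])
print(sigmoid(x))   # ~[0.007, 0.5, 0.993]
print(np.tanh(x))   # ~[-0.9999, 0.0, 0.9999], bounded in (-1, 1)
```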