An efficient semi-sigmoidal non-linear activation function approach for deep neural networks

A non-linear activation function is one of the key contributing factors to the success of Deep Learning (DL). Since the revival of DL in 2012, the Rectified Linear Unit (ReLU) has been regarded by the community as the de facto standard for many DL models. Despite its popularity, however, Re...
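The ReLU mentioned in the abstract is the standard rectified linear unit, f(x) = max(0, x). A minimal sketch of that baseline function (not the semi-sigmoidal function proposed in this thesis, whose definition is not given in this record):

```python
import numpy as np

def relu(x):
    # Standard Rectified Linear Unit: max(0, x), applied elementwise.
    # This is the de facto baseline activation the abstract refers to.
    return np.maximum(0.0, x)

# Negative inputs are clipped to zero; positive inputs pass through unchanged.
x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))  # [0.  0.  0.  1.5]
```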

Overview

Bibliographic Details
Main Author: Chieng, Hock Hung
Format: Thesis
Language: English
Published: 2022
Online Access: http://eprints.uthm.edu.my/8409/1/24p%20CHIENG%20HOCK%20HUNG.pdf
http://eprints.uthm.edu.my/8409/2/CHIENG%20HOCK%20HUNG%20COPYRIGHT%20DECLARATION.pdf
http://eprints.uthm.edu.my/8409/3/CHIENG%20HOCK%20HUNG%20WATERMARK.pdf