An efficient semi-sigmoidal non-linear activation function approach for deep neural networks

A non-linear activation function is one of the key contributing factors to the success of Deep Learning (DL). Since the revival of DL in 2012, the Rectified Linear Unit (ReLU) has been regarded by the community as a de facto standard for many DL models. Despite its popularity, however, Re...
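For context, the abstract above names ReLU, whose standard definition is ReLU(x) = max(0, x). The short NumPy sketch below illustrates only that baseline definition; it is not the semi-sigmoidal activation function the thesis proposes, which is not reproduced in this record.

# Minimal sketch of the standard ReLU definition, ReLU(x) = max(0, x).
# Illustrative only; not the thesis's proposed semi-sigmoidal function.
import numpy as np

def relu(x):
    # Element-wise maximum of 0 and x: negative inputs map to 0,
    # non-negative inputs pass through unchanged.
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, 0.0, 3.0])))  # -> [0. 0. 3.]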


Bibliographic Details
Main Author: Chieng, Hock Hung
Format: Thesis
Language: English
Published: 2022
Subjects:
Online Access:http://eprints.uthm.edu.my/8409/1/24p%20CHIENG%20HOCK%20HUNG.pdf
http://eprints.uthm.edu.my/8409/2/CHIENG%20HOCK%20HUNG%20COPYRIGHT%20DECLARATION.pdf
http://eprints.uthm.edu.my/8409/3/CHIENG%20HOCK%20HUNG%20WATERMARK.pdf