Volume: 28 | Article ID: art00007
Adaptive Activation Functions for Deep Networks
DOI: 10.2352/ISSN.2470-1173.2016.19.COIMG-149 | Published Online: February 2016
Abstract

Artificial neural networks loosely mimic the complex web of nearly 100 trillion connections in the human brain. Deep neural networks, and specifically convolutional neural networks, have recently demonstrated breakthrough performance in the pattern recognition community. Studies on network depth, regularization, filters, choice of activation function, and training parameters are numerous. With regard to activation functions, the rectified linear unit is favored over the sigmoid and tanh functions because the differentiation of larger signals is maintained. This paper introduces multiple activation functions per single neuron. Libraries have been generated that allow individual neurons within a neural network to select among a multitude of activation functions, where the selection of each function is done on a node-by-node basis to minimize classification error; each node may use more than one activation function if doing so reduces the final classification error. The resulting networks have been trained on several commonly used datasets, show increases in classification performance, and are compared to recent findings in neuroscience research.
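
The abstract describes the mechanism only at a high level. As a minimal sketch of the idea, the following PyTorch-style layer (our illustration, not the paper's code) gives each neuron a learned, softmax-weighted blend over a small library of activation functions, so a node can effectively use more than one function when that lowers the training loss. The class name AdaptiveActivation, the three-function library, and the softmax mixing scheme are all assumptions made for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveActivation(nn.Module):
    """Per-neuron learnable mixture of candidate activation functions.

    Illustrative sketch only: each of the `num_features` neurons owns a
    weight vector over the candidate functions, a softmax keeps the
    mixture convex, and the weights are trained by backpropagation
    together with the rest of the network.
    """

    def __init__(self, num_features: int):
        super().__init__()
        # Hypothetical candidate library: ReLU, tanh, sigmoid.
        self.candidates = [F.relu, torch.tanh, torch.sigmoid]
        # One mixing logit per (neuron, candidate) pair, learned jointly.
        self.logits = nn.Parameter(torch.zeros(num_features, len(self.candidates)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_features). Evaluate every candidate, then blend.
        stacked = torch.stack([f(x) for f in self.candidates], dim=-1)  # (B, N, K)
        weights = torch.softmax(self.logits, dim=-1)                    # (N, K)
        return (stacked * weights).sum(dim=-1)                          # (B, N)

# Usage: drop in wherever a fixed nonlinearity would normally go.
model = nn.Sequential(nn.Linear(784, 128), AdaptiveActivation(128), nn.Linear(128, 10))
```

The softmax keeps each neuron's mixture convex and differentiable; the paper's actual per-node selection mechanism may differ.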

Cite this article

Michael Dushkoff, Raymond Ptucha, "Adaptive Activation Functions for Deep Networks," in Proc. IS&T Int’l. Symp. on Electronic Imaging: Computational Imaging XIV, 2016, https://doi.org/10.2352/ISSN.2470-1173.2016.19.COIMG-149

Copyright statement
Copyright © Society for Imaging Science and Technology 2016
Electronic Imaging
ISSN 2470-1173
Society for Imaging Science and Technology
7003 Kilworth Lane, Springfield, VA 22151 USA