Author: Afghan, Sher; Naumann, Uwe
Title: Interval Adjoint Significance Analysis for Neural Networks
ID: 3oeoh7ol
Document date: 2020-05-22
Document: The architecture of a neural network is a major factor in its computational complexity and memory footprint. In this regard, a robust pruning method based on interval adjoint significance analysis is presented in this paper to remove irrelevant and redundant nodes from a neural network. The significance of a node is defined as the product of the node's interval width and the absolute maximum of the first-order derivative (adjoint) of the output with respect to that node over its interval. Based on these node significances, one can decide how much to prune from each layer. We show that the proposed method works effectively on hidden and input layers in experiments on well-known, complex machine learning datasets. In the proposed method, a node is removed based on its significance, and the biases of the remaining nodes are updated to compensate.
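The significance metric described in the abstract can be sketched in a few lines. The sketch below is a minimal illustration, not the authors' implementation: it assumes the per-node interval widths and the absolute maxima of the first-order adjoints have already been computed (e.g. by an interval adjoint AD tool), and the function and variable names (`node_significance`, `select_nodes_to_prune`, `prune_fraction`) are hypothetical.

```python
import numpy as np

def node_significance(interval_width, max_abs_adjoint):
    # Significance of a node: product of its interval width and the
    # absolute maximum of the first-order derivative over that interval.
    return interval_width * max_abs_adjoint

def select_nodes_to_prune(widths, adjoints, prune_fraction):
    # Rank the nodes of one layer by significance and return the indices
    # of the least significant fraction, the candidates for removal.
    scores = node_significance(np.asarray(widths), np.asarray(adjoints))
    n_prune = int(len(scores) * prune_fraction)
    order = np.argsort(scores)  # ascending: least significant first
    return sorted(order[:n_prune].tolist())

# Hypothetical per-node interval widths and max |adjoint| values for one layer.
widths   = [0.9, 0.1, 0.5, 0.05, 0.7]
adjoints = [1.2, 0.3, 0.8, 0.10, 0.9]
pruned = select_nodes_to_prune(widths, adjoints, prune_fraction=0.4)
# pruned → [1, 3]: the two nodes with the smallest width * |adjoint| products
```

The bias update for the remaining nodes, which the abstract also mentions, is omitted here because the record does not specify its exact form.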