The choice of network architecture is a major factor in the computational complexity and memory footprint of a neural network. In this regard, this paper presents a robust pruning method based on interval adjoint significance analysis that removes irrelevant and redundant nodes from a neural network. The significance of a node is defined as the product of the node's interval width and the absolute maximum of the first-order derivative (adjoint) over that node's interval. Based on these node significances, one can decide how much to prune from each layer. We show that the proposed method works effectively on both hidden and input layers through experiments on well-known and complex machine learning datasets. In the proposed method, a node is removed according to its significance, and the bias is updated for the remaining nodes.
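The sketch below illustrates how such a significance score could be computed and used to prune a layer, following the abstract's definition (interval width times the absolute maximum of the adjoint over the node's interval). The function and variable names (significance, prune_layer, node_midpoints, keep_ratio) are illustrative, and the bias-correction step, which folds each removed node's midpoint contribution into the next layer's bias, is an assumption about how "bias is updated for remaining nodes" might be realized, not the paper's exact rule.

```python
import numpy as np

def significance(interval_width, adjoint_lo, adjoint_hi):
    """Per-node significance: interval width times the absolute maximum
    of the first-order derivative (adjoint) over that node's interval."""
    max_abs_adjoint = np.maximum(np.abs(adjoint_lo), np.abs(adjoint_hi))
    return interval_width * max_abs_adjoint

def prune_layer(W_out, b_out, sig, keep_ratio, node_midpoints):
    """Drop the least significant nodes of a layer.

    W_out          : (n_next, n_nodes) outgoing weight matrix
    b_out          : (n_next,) bias of the next layer
    sig            : (n_nodes,) significance scores
    keep_ratio     : fraction of nodes to keep in this layer
    node_midpoints : (n_nodes,) midpoint of each node's activation interval
                     (assumed here as the value absorbed into the bias;
                     the paper's exact update rule may differ)
    """
    n_keep = max(1, int(round(keep_ratio * sig.size)))
    keep = np.argsort(sig)[::-1][:n_keep]            # most significant nodes
    drop = np.setdiff1d(np.arange(sig.size), keep)   # nodes to remove
    # Fold the dropped nodes' midpoint contribution into the next layer's bias.
    b_new = b_out + W_out[:, drop] @ node_midpoints[drop]
    return W_out[:, keep], b_new, keep

# Example: prune half of a 4-node layer feeding a 3-node layer.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))
b = np.zeros(3)
sig = significance(np.array([0.5, 2.0, 0.1, 1.0]),
                   adjoint_lo=np.array([-0.2, -1.5, 0.0, 0.3]),
                   adjoint_hi=np.array([0.4, 0.5, 0.05, 0.9]))
W_pruned, b_updated, kept = prune_layer(W, b, sig, keep_ratio=0.5,
                                        node_midpoints=np.array([0.1, 0.8, 0.0, 0.4]))
```

Ranking nodes by this score lets the same threshold logic be applied per layer, which matches the abstract's point that the scores indicate how much to prune from each layer.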