Abstract: This paper introduces the Dendrite Spherical Neuron (DSN) as an alternative to the Dendrite Ellipsoidal Neuron (DEN), in which hyperspheres, rather than hyperellipsoids, group the patterns from different classes. The rationale behind DSN is to simplify the DEN architecture: where DEN parameterizes each dendrite with a centroid and a covariance matrix, DSN replaces the covariance matrix with a radius. This modification avoids singular covariance matrices, which are problematic because DEN must compute the Mahalanobis distance to classify patterns. DSN training consists of determining the dendrite centroids with the k-means algorithm, computing each dendrite's radius as the mean distance to its two nearest centroids, and finally learning the weights of a softmax function at the output of the neuron with Stochastic Gradient Descent. In addition, Simulated Annealing automatically determines the number of dendrites that maximizes the classification accuracy. DSN is evaluated on synthetic and real-world datasets. The experimental results show that DSN is competitive with Multilayer Perceptron (MLP) networks while using less complex architectures, and it tends to outperform the Dendrite Morphological Neuron (DMN), which uses hyperboxes. These findings suggest that DSN is a viable alternative to MLP and DMN for pattern classification tasks.
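The training pipeline described in the abstract (k-means centroids, radius from the two nearest centroids, softmax output trained with SGD) can be sketched as follows. This is a minimal illustration assuming NumPy and scikit-learn; the function names (fit_dendrites, dendrite_features, fit_softmax) and the signed-distance dendrite response are assumptions for illustration, not the paper's exact formulation.

    # Hedged sketch of DSN training; names and the dendrite response
    # function are assumptions, not the paper's definitions.
    import numpy as np
    from sklearn.cluster import KMeans

    def fit_dendrites(X, n_dendrites, random_state=0):
        """Steps 1-2: centroids via k-means; each radius is the mean
        distance from a centroid to its two nearest other centroids."""
        km = KMeans(n_clusters=n_dendrites, n_init=10,
                    random_state=random_state).fit(X)
        C = km.cluster_centers_                              # (k, d)
        D = np.linalg.norm(C[:, None] - C[None], axis=-1)    # (k, k) pairwise
        np.fill_diagonal(D, np.inf)                          # ignore self-distance
        radii = np.sort(D, axis=1)[:, :2].mean(axis=1)       # two nearest centroids
        return C, radii

    def dendrite_features(X, C, radii):
        """Assumed spherical dendrite response: radius minus Euclidean
        distance to the centroid (positive inside the hypersphere)."""
        d = np.linalg.norm(X[:, None] - C[None], axis=-1)    # (n, k)
        return radii[None, :] - d

    def fit_softmax(H, y, n_classes, lr=0.1, epochs=200, batch=32, seed=0):
        """Step 3: softmax output layer trained with plain SGD on
        cross-entropy loss."""
        rng = np.random.default_rng(seed)
        n, k = H.shape
        W = np.zeros((k, n_classes)); b = np.zeros(n_classes)
        Y = np.eye(n_classes)[y]                             # one-hot targets
        for _ in range(epochs):
            idx = rng.permutation(n)
            for s in range(0, n, batch):
                i = idx[s:s + batch]
                Z = H[i] @ W + b
                P = np.exp(Z - Z.max(axis=1, keepdims=True)) # stable softmax
                P /= P.sum(axis=1, keepdims=True)
                G = (P - Y[i]) / len(i)                      # cross-entropy gradient
                W -= lr * H[i].T @ G
                b -= lr * G.sum(axis=0)
        return W, b

    def predict(X, C, radii, W, b):
        return np.argmax(dendrite_features(X, C, radii) @ W + b, axis=1)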
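The Simulated Annealing search over the number of dendrites could look like the sketch below. The neighborhood move, temperature schedule, and bounds (k_min, k_max, t0, cooling) are illustrative assumptions, and evaluate(k) is a hypothetical helper that trains a DSN with k dendrites (e.g., fit_dendrites plus fit_softmax above) and returns validation accuracy.

    # Hedged sketch of the annealing loop; all hyperparameters are assumed.
    import math, random

    def anneal_n_dendrites(evaluate, k_min=2, k_max=30,
                           t0=1.0, cooling=0.95, steps=100):
        """Maximize validation accuracy over the number of dendrites k."""
        k = random.randint(k_min, k_max)
        best_k, best_acc = k, evaluate(k)
        acc, t = best_acc, t0
        for _ in range(steps):
            # Propose a small random change to k, clipped to the bounds.
            k_new = min(k_max, max(k_min, k + random.choice([-2, -1, 1, 2])))
            acc_new = evaluate(k_new)
            # Accept improvements always; worse moves with Boltzmann probability.
            if acc_new >= acc or random.random() < math.exp((acc_new - acc) / t):
                k, acc = k_new, acc_new
                if acc > best_acc:
                    best_k, best_acc = k, acc
            t *= cooling                                     # geometric cooling
        return best_k, best_acc

A typical use would pass a closure that splits the data, calls the training sketch with k dendrites, and scores predict on the held-out split.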

Subjects:
  • Exponentials
  • Neurohistology
  • Similarity and distance measures
  • Neuroplasticity
  • Gradient methods