Publication Date:
1995
Abstract:
A new technique, called Sequential Window Learning
(SWL), for the construction of two-layer perceptrons
with binary inputs is presented.
It generates the number of hidden neurons together with
the correct values for the weights, starting from any binary
training set. The introduction of a new type of neuron,
having a window-shaped activation function, considerably
increases the convergence speed and the compactness of the
resulting networks.
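The window-shaped activation can be illustrated with a minimal sketch, assuming (this exact form is not given in the abstract and is a hypothetical interpretation) that the neuron outputs 1 only when the weighted sum of its binary inputs falls inside an interval around a center value:

```python
import numpy as np

def window_neuron(x, w, center, width):
    """Hypothetical window-shaped activation: output 1 when the
    weighted sum of the binary inputs lies in the window
    [center - width, center + width], and 0 otherwise.
    The precise form used by SWL may differ."""
    s = np.dot(w, x)
    return 1 if abs(s - center) <= width else 0
```

With such a unit, a single hidden neuron can isolate input patterns whose weighted sum lies in a narrow band, which is one plausible reason a window response can reduce the number of hidden neurons compared with a standard threshold unit.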
Furthermore, a preprocessing technique, called Hamming
Clustering (HC), is proposed for improving the generalization
ability of constructive algorithms for binary feedforward
neural networks. Its insertion in the Sequential Window
Learning is straightforward.
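As an illustration only, one simple way to group binary patterns by Hamming distance is a greedy pass that assigns each pattern to the first cluster whose seed is within a given radius; the actual Hamming Clustering procedure proposed in the paper may work differently:

```python
def hamming_distance(a, b):
    """Number of bit positions in which two binary patterns differ."""
    return sum(x != y for x, y in zip(a, b))

def greedy_hamming_cluster(patterns, radius=1):
    """Illustrative greedy clustering: each pattern joins the first
    cluster whose seed (first member) lies within `radius` Hamming
    distance, otherwise it starts a new cluster.  This is an assumed
    sketch, not the paper's HC algorithm."""
    clusters = []
    for p in patterns:
        for c in clusters:
            if hamming_distance(p, c[0]) <= radius:
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters
```

Grouping nearby binary patterns in this way lets a constructive algorithm treat a whole cluster as one region of the input space, which is the general mechanism by which such preprocessing can improve generalization.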
Tests on classical benchmarks show the good performance
of the proposed techniques, in terms of both network complexity
and recognition accuracy.
CRIS Type:
01.01 Journal article
Author list:
Muselli, Marco