Evaluating Hebbian learning in a semi-supervised setting

Book chapter
Publication date:
2022
Abstract:
We propose a semi-supervised learning strategy for deep Convolutional Neural Networks (CNNs) in which an unsupervised pre-training stage, performed using biologically inspired Hebbian learning algorithms, is followed by supervised end-to-end backprop fine-tuning. We explored two Hebbian learning rules for the unsupervised pre-training stage: soft-Winner-Takes-All (soft-WTA) and nonlinear Hebbian Principal Component Analysis (HPCA). Our approach was applied in sample efficiency scenarios, where the amount of available labeled training samples is very limited, and unsupervised pre-training is therefore beneficial. We performed experiments on CIFAR10, CIFAR100, and Tiny ImageNet datasets. Our results show that Hebbian pre-training outperforms Variational Auto-Encoder (VAE) pre-training in almost all cases, with HPCA generally performing better than soft-WTA.
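The abstract names two Hebbian rules used for the unsupervised pre-training stage. The snippet below is a minimal, illustrative sketch of a soft-WTA-style Hebbian update for a single fully connected layer in plain NumPy; the function name, the softmax-based competition, and the temperature parameter are assumptions made here for illustration, not the authors' implementation (which targets convolutional layers of a CNN).

    import numpy as np

    def soft_wta_hebbian_step(W, x, lr=0.01, temperature=1.0):
        # W: (num_neurons, num_inputs) weight matrix of one layer
        # x: (num_inputs,) one unlabeled input vector (e.g. a flattened image patch)
        y = W @ x                                # neuron responses to the input
        r = np.exp((y - y.max()) / temperature)  # numerically stable softmax ...
        r /= r.sum()                             # ... gives soft competition scores
        W += lr * r[:, None] * (x[None, :] - W)  # each neuron moves toward x, gated by its score
        return W

    # Hypothetical usage: repeat over an unlabeled dataset before supervised fine-tuning
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(16, 64))     # 16 neurons, 64-dimensional inputs
    for x in rng.normal(size=(1000, 64)):        # stand-in for unlabeled samples
        W = soft_wta_hebbian_step(W, x)

In the strategy described above, such unsupervised updates would be used to initialize the network weights from unlabeled data, after which the whole CNN is fine-tuned end-to-end with backpropagation on the small labeled subset.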
CRIS type:
02.01 Contribution in a volume (Chapter or Essay)
Keywords:
Hebbian learning; Deep learning; Semi-supervised; Sample efficiency; Neural networks; Bio-inspired
Authors:
Amato, Giuseppe; Gennaro, Claudio; Falchi, Fabrizio
Institutional authors:
AMATO GIUSEPPE
FALCHI FABRIZIO
GENNARO CLAUDIO
Link to the full record:
https://iris.cnr.it/handle/20.500.14243/431703
Link to the full text:
https://iris.cnr.it//retrieve/handle/20.500.14243/431703/127705/prod_465268-doc_182673.pdf
https://iris.cnr.it//retrieve/handle/20.500.14243/431703/127709/prod_465268-doc_182719.pdf
Book title:
Machine Learning, Optimization, and Data Science

General information

URL

https://link.springer.com/chapter/10.1007/978-3-030-95470-3_28