Publication Date:
2014
Abstract:
The classical machine learning problem of estimating an unknown function through an empirical risk minimization (ERM) procedure is addressed in the setting where models based on local evaluation of the output are employed and there is freedom to sample the input space according to a deterministic rule. The combined use of lattice point sets, commonly employed for numerical integration, and local models based on Nadaraya-Watson kernel smoothers is analyzed with respect to consistency of the ERM procedure. It is proved that the regular structure of lattice sampling guarantees consistency with good convergence rates. Furthermore, it is shown that this regular structure also yields practical advantages, such as fast computation of the model output. Simulation tests are presented to showcase the behavior of Nadaraya-Watson models with lattice sampling in various function learning problems.
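The abstract combines two standard ingredients: rank-1 lattice point sets for sampling the input space, and the Nadaraya-Watson kernel estimator for local modeling. The sketch below illustrates the general idea only; it is not the paper's method. The generating vector, kernel, bandwidth, and test function are all illustrative assumptions.

```python
import numpy as np

def lattice_points(n, g):
    """Rank-1 lattice: x_i = frac(i * g / n) in [0, 1)^d.
    The generating vector g here is an example choice, not one from the paper."""
    i = np.arange(n)[:, None]
    return np.mod(i * np.asarray(g, dtype=float)[None, :] / n, 1.0)

def nadaraya_watson(x_query, X, y, h):
    """Nadaraya-Watson estimate: kernel-weighted average of sampled outputs.
    A Gaussian kernel with bandwidth h is assumed for illustration."""
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * h ** 2))
    return np.sum(w * y) / np.sum(w)

# Toy learning problem on [0, 1]^2 with a Fibonacci-type lattice (example only)
n, g = 144, (1, 89)
X = lattice_points(n, g)
y = np.sin(2.0 * np.pi * X[:, 0]) + X[:, 1]  # unknown function, sampled on the lattice
est = nadaraya_watson(np.array([0.3, 0.5]), X, y, h=0.05)
```

Because lattice points are evenly spread, every query point has nearby samples, which is the intuition behind the consistency and fast-evaluation claims in the abstract.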
Iris type:
04.01 Contribution in conference proceedings
Keywords:
Artificial intelligence; Learning systems
List of contributors:
Marcialis, Roberto; Cervellera, Cristiano; Maccio', Danilo; Gaggero, Mauro
Book title:
Proceedings of the International Joint Conference on Neural Networks
Published in: