An incremental decomposition method for unconstrained optimization

Academic Article
Publication date: 2014

Abstract:
In this work we consider the problem of minimizing a sum of continuously differentiable functions. The vector of variables is partitioned into two blocks, and the objective function is assumed to be convex with respect to one of the block components. Problems with this structure arise, for instance, in machine learning. To exploit the structure of the objective function, and to account for the fact that the number of its terms may be huge, we propose a decomposition algorithm combined with an incremental gradient strategy. Global convergence of the proposed algorithm is proved. Computational experiments on large-scale real problems show the effectiveness of the proposed approach compared with existing algorithms. (C) 2014 Elsevier Inc. All rights reserved.

IRIS type: 01.01 Journal article (Articolo in rivista)
Keywords: Large-scale unconstrained optimization; Decomposition; Gradient incremental methods
Contributors: Sciandrone, Marco
Handle: https://iris.cnr.it/handle/20.500.14243/222232
Published in: APPLIED MATHEMATICS AND COMPUTATION (journal)
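To make the idea in the abstract concrete, the following is a minimal illustrative sketch (not the authors' algorithm) of combining two-block decomposition with incremental gradient steps, on the toy objective f(x, y) = Σᵢ (Aᵢ·x + Bᵢ·y − cᵢ)², which is a sum of differentiable terms and convex in each block. The variables are split into blocks x and y: block x is updated with incremental gradient steps, one term at a time, while block y (playing the role of the "convex" block) is updated with a full-gradient step. All names, step sizes, and the least-squares objective are hypothetical choices for illustration.

```python
import numpy as np

def incremental_decomposition(A, B, c, steps=300, lr=5e-3):
    """Illustrative sketch of a two-block decomposition with an
    incremental gradient strategy, on the hypothetical objective
        f(x, y) = sum_i (A[i] @ x + B[i] @ y - c[i])**2.
    Block x: incremental gradient steps, one term i at a time.
    Block y: a full-gradient block update per outer iteration.
    This is NOT the algorithm of the paper, only a toy analogue."""
    n, p = A.shape
    _, q = B.shape
    x = np.zeros(p)
    y = np.zeros(q)
    for _ in range(steps):
        # Incremental pass over the n terms, updating block x only:
        # each step uses the gradient of a single term w.r.t. x.
        for i in range(n):
            r_i = A[i] @ x + B[i] @ y - c[i]
            x -= lr * 2.0 * r_i * A[i]
        # Block update on y using the full gradient of f w.r.t. y
        # at the current x (the block the objective is convex in).
        r = A @ x + B @ y - c
        y -= lr * 2.0 * (B.T @ r)
    return x, y
```

With a consistent system (c exactly in the range of [A B]) and a small fixed step size, both the incremental x-steps and the y-block steps drive the residual toward zero; in general, a fixed-step incremental gradient method only reaches a neighborhood of a stationary point, which is why convergence analyses such as the one in this paper typically require diminishing or carefully controlled step sizes.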