Dynamic smoothness parameter for fast gradient methods

Academic Article
Publication Date:
2018
Abstract:
We present and computationally evaluate a variant of Nesterov's fast gradient method that can exploit information, even if approximate, about the optimal value of the problem. Such information is available in some applications, among them the computation of bounds for hard integer programs. We show that dynamically changing the smoothness parameter of the algorithm using this information yields a better convergence profile in practice.
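The article's algorithm adapts the smoothness parameter of Nesterov's fast gradient method; the full scheme is given in the paper. As a minimal illustration of the general idea that knowledge of the optimal value can steer step-size choices, here is a sketch of the classical Polyak step-size rule (not the article's method), which sets each step from the gap f(x_k) − f*; the quadratic test function and all names below are illustrative assumptions.

```python
# Illustrative sketch only (NOT the algorithm of the article): a plain
# gradient method with a Polyak-type step size, which likewise exploits
# knowledge of (an estimate of) the optimal value f_star.

def polyak_gradient_descent(grad, f, x0, f_star, iters=50):
    """Minimize a convex f when (an estimate of) its optimal value is known.

    Step size: t_k = (f(x_k) - f_star) / ||grad f(x_k)||^2.
    """
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        gap = f(x) - f_star
        sq = sum(gi * gi for gi in g)
        if sq == 0.0 or gap <= 0.0:  # already (numerically) optimal
            break
        t = gap / sq
        x = [xi - t * gi for xi, gi in zip(x, g)]
    return x

# Example: f(x) = ||x - c||^2, whose optimal value f_star = 0 is known exactly.
c = [1.0, -2.0, 3.0]
f = lambda x: sum((xi - ci) ** 2 for xi, ci in zip(x, c))
grad = lambda x: [2.0 * (xi - ci) for xi, ci in zip(x, c)]
x = polyak_gradient_descent(grad, f, [0.0, 0.0, 0.0], f_star=0.0)
```

On this quadratic the Polyak step halves the distance to the minimizer at every iteration; when only an approximate f* is available (as in the Lagrangian-relaxation setting the abstract mentions), the same gap-driven idea still applies, with convergence degrading gracefully in the estimation error.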
Iris type:
01.01 Articolo in rivista (journal article)
Keywords:
Convex optimization; Fast gradient method; Lagrangian relaxation
List of contributors:
Frangioni, Antonio
Handle:
https://iris.cnr.it/handle/20.500.14243/329250
Published in:
OPTIMIZATION LETTERS
URL:

http://www.scopus.com/inward/record.url?eid=2-s2.0-85025162051&partnerID=q2rCbXpz