Geovani Nunes Grapiglia

ORCID: 0000-0003-3284-3371

According to our database, Geovani Nunes Grapiglia authored at least 23 papers between 2015 and 2025.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2025
Universal nonmonotone line search method for nonconvex multiobjective optimization problems with convex constraints.
Comput. Appl. Math., February, 2025

2024
Worst-case evaluation complexity of a derivative-free quadratic regularization method.
Optim. Lett., January, 2024

2023
Adaptive Third-Order Methods for Composite Convex Optimization.
SIAM J. Optim., September, 2023

Worst-case evaluation complexity of a quadratic penalty method for nonconvex optimization.
Optim. Methods Softw., July, 2023

Quadratic regularization methods with finite-difference gradient approximations.
Comput. Optim. Appl., July, 2023

An Adaptive Riemannian Gradient Method Without Function Evaluations.
J. Optim. Theory Appl., June, 2023

A subgradient method with non-monotone line search.
Comput. Optim. Appl., March, 2023

First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians.
CoRR, 2023

2022
Tensor methods for finding approximate stationary points of convex functions.
Optim. Methods Softw., 2022

A cubic regularization of Newton's method with finite difference Hessian approximations.
Numer. Algorithms, 2022

An adaptive trust-region method without function evaluations.
Comput. Optim. Appl., 2022

2021
On inexact solution of auxiliary problems in tensor methods for convex optimization.
Optim. Methods Softw., 2021

A generalized worst-case complexity analysis for non-monotone line searches.
Numer. Algorithms, 2021

Worst-case evaluation complexity of derivative-free nonmonotone line search methods for solving nonlinear systems of equations.
Comput. Appl. Math., 2021

2020
Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives.
SIAM J. Optim., 2020

A subspace version of the Wang-Yuan Augmented Lagrangian-Trust Region method for equality constrained optimization.
Appl. Math. Comput., 2020

2019
Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions.
SIAM J. Optim., 2019

Improved optimization methods for image registration problems.
Numer. Algorithms, 2019

2017
Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians.
SIAM J. Optim., 2017

On the worst-case evaluation complexity of non-monotone line search algorithms.
Comput. Optim. Appl., 2017

2016
On the worst-case complexity of nonlinear stepsize control algorithms for convex unconstrained optimization.
Optim. Methods Softw., 2016

Nonlinear Stepsize Control Algorithms: Complexity Bounds for First- and Second-Order Optimality.
J. Optim. Theory Appl., 2016

2015
On the convergence and worst-case complexity of trust-region and regularization methods for unconstrained optimization.
Math. Program., 2015
