Nikita Doikov

Orcid: 0000-0003-1141-1625

According to our database, Nikita Doikov authored at least 22 papers between 2018 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Super-Universal Regularized Newton Method.
SIAM J. Optim., March, 2024

Gradient regularization of Newton method with Bregman distances.
Math. Program., March, 2024

Improving Stochastic Cubic Newton with Momentum.
CoRR, 2024

Cubic regularized subspace Newton for non-convex optimization.
CoRR, 2024

On Convergence of Incremental Gradient for Non-convex Smooth Functions.
Proceedings of the 41st International Conference on Machine Learning, 2024

Spectral Preconditioning for Gradient Methods on Graded Non-convex Functions.
Proceedings of the 41st International Conference on Machine Learning, 2024

2023
Affine-invariant contracting-point methods for Convex Optimization.
Math. Program., March, 2023

First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians.
CoRR, 2023

Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method.
CoRR, 2023

Shuffle SGD is Always Better than SGD: Improved Analysis of SGD with Arbitrary Data Orders.
CoRR, 2023

Unified Convergence Theory of Stochastic and Variance-Reduced Cubic Newton Methods.
CoRR, 2023

Polynomial Preconditioning for Gradient Methods.
Proceedings of the 40th International Conference on Machine Learning, 2023

Second-Order Optimization with Lazy Hessians.
Proceedings of the 40th International Conference on Machine Learning, 2023

Linearization Algorithms for Fully Composite Optimization.
Proceedings of the 36th Annual Conference on Learning Theory, 2023

2022
High-Order Optimization Methods for Fully Composite Problems.
SIAM J. Optim., September, 2022

Local convergence of tensor methods.
Math. Program., 2022

2021
Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method.
J. Optim. Theory Appl., 2021

2020
Contracting Proximal Methods for Smooth Convex Optimization.
SIAM J. Optim., 2020

Convex optimization based on global lower second-order models.
Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, 2020

Stochastic Subspace Cubic Newton Method.
Proceedings of the 37th International Conference on Machine Learning, 2020

Inexact Tensor Methods with Dynamic Accuracies.
Proceedings of the 37th International Conference on Machine Learning, 2020

2018
Randomized Block Cubic Newton Method.
Proceedings of the 35th International Conference on Machine Learning, 2018
