Nicolas Loizou

ORCID: 0000-0003-4359-6492

According to our database, Nicolas Loizou authored at least 37 papers between 2015 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Dissipative Gradient Descent Ascent Method: A Control Theory Inspired Algorithm for Min-Max Optimization.
IEEE Control. Syst. Lett., 2024

Stochastic Polyak Step-sizes and Momentum: Convergence Guarantees and Practical Performance.
CoRR, 2024

Remove that Square Root: A New Efficient Scale-Invariant Version of AdaGrad.
CoRR, 2024

Communication-Efficient Gradient Descent-Accent Methods for Distributed Variational Inequalities: Unified Analysis and Local Updates.
Proceedings of the Twelfth International Conference on Learning Representations, 2024

Stochastic Extragradient with Random Reshuffling: Improved Convergence for Variational Inequalities.
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2024

2023
Unified Analysis of Stochastic Gradient Methods for Composite Convex and Smooth Optimization.
J. Optim. Theory Appl., November, 2023

AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods.
Trans. Mach. Learn. Res., 2023

Stochastic Mirror Descent: Convergence Analysis and Adaptive Variants via the Mirror Stochastic Polyak Stepsize.
Trans. Mach. Learn. Res., 2023

Locally Adaptive Federated Learning via Stochastic Polyak Stepsizes.
CoRR, 2023

Single-Call Stochastic Extragradient Methods for Structured Non-monotone Variational Inequalities: Improved Analysis under Weaker Conditions.
Proceedings of the Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, 2023

A Unified Approach to Reinforcement Learning, Quantal Response Equilibria, and Two-Player Zero-Sum Games.
Proceedings of the Eleventh International Conference on Learning Representations, 2023

Stochastic Gradient Descent-Ascent: Unified Theory and New Efficient Methods.
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2023

2022
Dynamics of SGD with Stochastic Polyak Stepsizes: Truly Adaptive Variants and Convergence to Exact Solution.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

On the Convergence of Stochastic Extragradient for Bilinear Games using Restarted Iteration Averaging.
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2022

Extragradient Method: O(1/K) Last-Iterate Convergence for Monotone Variational Inequalities and Connections With Cocoercivity.
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2022

Stochastic Extragradient: General Analysis and Improved Rates.
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2022

2021
Revisiting Randomized Gossip Algorithms: General Framework, Convergence Rates and Novel Block and Accelerated Protocols.
IEEE Trans. Inf. Theory, 2021

On the Convergence of Stochastic Extragradient for Bilinear Games with Restarted Iteration Averaging.
CoRR, 2021

AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods.
CoRR, 2021

Stochastic Gradient Descent-Ascent and Consensus Optimization for Smooth Games: Convergence Analysis under Expected Co-coercivity.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021

Stochastic Polyak Step-size for SGD: An Adaptive Learning Rate for Fast Convergence.
Proceedings of the 24th International Conference on Artificial Intelligence and Statistics, 2021

SGD for Structured Nonconvex Functions: Learning Rates, Minibatching and Interpolation.
Proceedings of the 24th International Conference on Artificial Intelligence and Statistics, 2021

2020
Convergence Analysis of Inexact Randomized Iterative Methods.
SIAM J. Sci. Comput., 2020

Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods.
Comput. Optim. Appl., 2020

Stochastic Hamiltonian Gradient Methods for Smooth Games.
Proceedings of the 37th International Conference on Machine Learning, 2020

A Unified Theory of Decentralized SGD with Changing Topology and Local Updates.
Proceedings of the 37th International Conference on Machine Learning, 2020

2019
Randomized Iterative Methods for Linear Systems: Momentum, Inexactness and Gossip.
CoRR, 2019

SGD: General Analysis and Improved Rates.
CoRR, 2019

A Privacy Preserving Randomized Gossip Algorithm via Controlled Noise Insertion.
CoRR, 2019

SGD with Arbitrary Sampling: General Analysis and Improved Rates.
Proceedings of the 36th International Conference on Machine Learning, 2019

Stochastic Gradient Push for Distributed Deep Learning.
Proceedings of the 36th International Conference on Machine Learning, 2019

Provably Accelerated Randomized Gossip Algorithms.
Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019

2018
Accelerated Gossip via Stochastic Heavy Ball Method.
Proceedings of the 56th Annual Allerton Conference on Communication, Control, and Computing, 2018

2017
Linearly convergent stochastic heavy ball method for minimizing generalization error.
CoRR, 2017

2016
Distributionally Robust Games with Risk-averse Players.
Proceedings of the 5th International Conference on Operations Research and Enterprise Systems (ICORES 2016), 2016

A new perspective on randomized gossip algorithms.
Proceedings of the 2016 IEEE Global Conference on Signal and Information Processing, 2016

2015
Distributionally Robust Game Theory.
CoRR, 2015
