Abdurakhmon Sadiev

According to our database, Abdurakhmon Sadiev authored at least 18 papers between 2020 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.


Bibliography

2024
Stochastic Gradient Methods with Preconditioned Updates.
J. Optim. Theory Appl., May 2024

Differentially Private Random Block Coordinate Descent.
CoRR, 2024

Error Feedback under (L₀, L₁)-Smoothness: Normalization and Momentum.
CoRR, 2024

SPAM: Stochastic Proximal Point Method with Momentum Variance Reduction for Non-convex Cross-Device Federated Learning.
CoRR, 2024

A Unified Theory of Stochastic Proximal Point Methods without Smoothness.
CoRR, 2024

Don't Compress Gradients in Random Reshuffling: Compress Gradient Differences.
Proceedings of the Advances in Neural Information Processing Systems 38: Annual Conference on Neural Information Processing Systems 2024, 2024

High-Probability Convergence for Composite and Distributed Stochastic Minimization and Variational Inequalities with Heavy-Tailed Noise.
Proceedings of the Forty-first International Conference on Machine Learning, 2024

2023
AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods.
Trans. Mach. Learn. Res., 2023

Adaptive Compression for Communication-Efficient Distributed Training.
Trans. Mach. Learn. Res., 2023

High-Probability Bounds for Stochastic Optimization and Variational Inequalities: the Case of Unbounded Variance.
Proceedings of the International Conference on Machine Learning, 2023

2022
Decentralized personalized federated learning: Lower bounds and optimal algorithm for all personalization modes.
EURO J. Comput. Optim., 2022

Communication Acceleration of Local Gradient Methods via an Accelerated Primal-Dual Algorithm with Inexact Prox.
CoRR, 2022

Federated Optimization Algorithms with Random Reshuffling and Gradient Compression.
CoRR, 2022

Communication Acceleration of Local Gradient Methods via an Accelerated Primal-Dual Algorithm with an Inexact Prox.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

Optimal Algorithms for Decentralized Stochastic Variational Inequalities.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

2021
Decentralized Personalized Federated Min-Max Problems.
CoRR, 2021

2020
Zeroth-Order Algorithms for Smooth Saddle-Point Problems.
CoRR, 2020

Gradient-Free Methods for Saddle-Point Problem.
CoRR, 2020
