Quoc Tran-Dinh
Orcid: 0000-0002-5866-0787
According to our database, Quoc Tran-Dinh authored at least 61 papers between 2013 and 2024.
Collaborative distances:
Bibliography
2024
Extragradient-type methods with $\mathcal{O}(1/k)$ last-iterate convergence rates for co-hypomonotone inclusions.
J. Glob. Optim., May 2024
From Halpern's fixed-point iterations to Nesterov's accelerated interpretations for root-finding problems.
Comput. Optim. Appl., January 2024
2023
A new randomized primal-dual algorithm for convex optimization with fast last iterate convergence rates.
Optim. Methods Softw., January 2023
2022
New Primal-Dual Algorithms for a Class of Nonsmooth and Nonlinear Convex-Concave Minimax Problems.
SIAM J. Optim., 2022
A unified convergence rate analysis of the accelerated smoothed gap reduction algorithm.
Optim. Lett., 2022
Math. Program., 2022
Math. Oper. Res., 2022
J. Glob. Optim., 2022
2021
J. Mach. Learn. Res., 2021
Identifying Heterogeneous Effect Using Latent Supervised Clustering With Adaptive Fusion.
J. Comput. Graph. Stat., 2021
A Lyapunov function for the combined system-optimizer dynamics in inexact model predictive control.
Autom., 2021
Minimization of a class of rare event probabilities and buffered probabilities of exceedance.
Ann. Oper. Res., 2021
Improved Complexity Of Trust-Region Optimization For Zeroth-Order Stochastic Oracles with Adaptive Sampling.
Proceedings of the Winter Simulation Conference, 2021
FedDR - Randomized Douglas-Rachford Splitting Algorithms for Nonconvex Federated Composite Optimization.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021
Proceedings of the 38th International Conference on Machine Learning, 2021
Proceedings of the 24th International Conference on Artificial Intelligence and Statistics, 2021
2020
SIAM J. Optim., 2020
Math. Program. Comput., 2020
J. Optim. Theory Appl., 2020
ProxSARAH: An Efficient Algorithmic Framework for Stochastic Composite Nonconvex Optimization.
J. Mach. Learn. Res., 2020
Asynchronous Federated Learning with Reduced Number of Rounds and with Differential Privacy from Less Aggregated Gaussian Noise.
CoRR, 2020
Comput. Optim. Appl., 2020
Hybrid Variance-Reduced SGD Algorithms For Minimax Problems with Nonconvex-Linear Function.
Proceedings of the Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, 2020
Proceedings of the 37th International Conference on Machine Learning, 2020
Proceedings of the 8th International Conference on Learning Representations, 2020
Proceedings of the 23rd International Conference on Artificial Intelligence and Statistics, 2020
2019
Math. Program. Comput., 2019
Self-concordant inclusions: a unified framework for path-following generalized Newton-type algorithms.
Math. Program., 2019
Math. Program., 2019
CoRR, 2019
A Hybrid Stochastic Optimization Framework for Stochastic Composite Nonconvex Optimization.
CoRR, 2019
Proximal alternating penalty algorithms for nonsmooth constrained convex optimization.
Comput. Optim. Appl., 2019
Non-stationary Douglas-Rachford and alternating direction method of multipliers: adaptive step-sizes and convergence.
Comput. Optim. Appl., 2019
Proceedings of the 58th IEEE Conference on Decision and Control, 2019
2018
A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization.
SIAM J. Optim., 2018
Proceedings of the Advances in Neural Information Processing Systems 31: Annual Conference on Neural Information Processing Systems 2018, 2018
2017
Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization.
Optim. Lett., 2017
Comput. Optim. Appl., 2017
Proceedings of the Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, 2017
2016
Proceedings of the Neural Information Processing - 23rd International Conference, 2016
Frank-Wolfe works for non-Lipschitz continuous gradient objectives: Scalable poisson phase retrieval.
Proceedings of the 2016 IEEE International Conference on Acoustics, 2016
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, 2016
2015
Proceedings of the Advances in Neural Information Processing Systems 28: Annual Conference on Neural Information Processing Systems 2015, 2015
Proceedings of the 3rd International Conference on Modelling, Computation and Optimization in Information Systems and Management Sciences, 2015
Proceedings of the 23rd European Signal Processing Conference, 2015
Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, 2015
2014
IEEE Signal Process. Mag., 2014
SIAM J. Optim., 2014
Computational Complexity of Inexact Gradient Augmented Lagrangian Methods: Application to Constrained MPC.
SIAM J. Control. Optim., 2014
Proceedings of the Advances in Neural Information Processing Systems 27: Annual Conference on Neural Information Processing Systems 2014, 2014
Proceedings of the IEEE International Conference on Acoustics, 2014
Proceedings of the Twenty-Eighth AAAI Conference on Artificial Intelligence, 2014
2013
A proximal Newton framework for composite minimization: Graph learning without Cholesky decompositions and matrix inversions.
Proceedings of the 30th International Conference on Machine Learning, 2013