Hideaki Iiduka
ORCID: 0000-0001-9173-6723
According to our database, Hideaki Iiduka authored at least 50 papers between 2009 and 2024.
Online presence:
- zbmath.org
- scopus.com
- orcid.org
Bibliography
2024
J. Optim. Theory Appl., August, 2024
Theoretical analysis of Adam using hyperparameters close to one without Lipschitz smoothness.
Numer. Algorithms, January, 2024
Convergence of Sharpness-Aware Minimization Algorithms using Increasing Batch Size and Decaying Learning Rate.
CoRR, 2024
Increasing Both Batch Size and Learning Rate Accelerates Stochastic Gradient Descent.
CoRR, 2024
Iteration and Stochastic First-order Oracle Complexities of Stochastic Gradient Descent using Constant and Decaying Learning Rates.
CoRR, 2024
CoRR, 2024
2023
ϵ-Approximation of Adaptive Learning Rate Optimization Algorithms for Constrained Nonconvex Stochastic Optimization.
IEEE Trans. Neural Networks Learn. Syst., October, 2023
Using Stochastic Gradient Descent to Smooth Nonconvex Functions: Analysis of Implicit Graduated Optimization with Optimal Noise Scheduling.
CoRR, 2023
Relationship between Batch Size and Number of Steps Needed for Nonconvex Optimization of Stochastic Gradient Descent using Armijo Line Search.
CoRR, 2023
Appl. Math. Comput., 2023
Existence and Estimation of Critical Batch Size for Training Generative Adversarial Networks with Two Time-Scale Update Rule.
Proceedings of the International Conference on Machine Learning, 2023
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2023
2022
Riemannian Adaptive Optimization Algorithm and its Application to Natural Language Processing.
IEEE Trans. Cybern., 2022
Appropriate Learning Rates of Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks.
IEEE Trans. Cybern., 2022
Critical Batch Size Minimizes Stochastic First-Order Oracle Complexity of Deep Learning Optimizer using Hyperparameters Close to One.
CoRR, 2022
Using Constant Learning Rate of Two Time-Scale Update Rule for Training Generative Adversarial Networks.
CoRR, 2022
2021
J. Optim. Theory Appl., 2021
Inexact stochastic subgradient projection method for stochastic equilibrium problems with nonmonotone bifunctions: application to expected risk minimization in machine learning.
J. Glob. Optim., 2021
Minimization of Stochastic First-order Oracle Complexity of Adaptive Methods for Nonconvex Optimization.
CoRR, 2021
The Number of Steps Needed for Nonconvex Optimization of a Deep Learning Optimizer is a Rational Function of Batch Size.
CoRR, 2021
Unified Algorithm Framework for Nonconvex Stochastic Optimization in Deep Neural Networks.
IEEE Access, 2021
2020
IEEE Trans. Cybern., 2020
Conjugate-gradient-based Adam for stochastic optimization and its application to deep learning.
CoRR, 2020
Comput. Optim. Appl., 2020
2019
Distributed Optimization for Network Resource Allocation With Nonsmooth Utility Functions.
IEEE Trans. Control. Netw. Syst., 2019
Two stochastic optimization algorithms for convex optimization with fixed point constraints.
Optim. Methods Softw., 2019
Incremental and Parallel Machine Learning Algorithms With Automated Learning Rate Adjustments.
Frontiers Robotics AI, 2019
2018
Optimality and convergence for convex ensemble learning with sparsity and diversity based on fixed point optimization.
Neurocomputing, 2018
2016
Incremental subgradient method for nonsmooth convex optimization with fixed point constraints.
Optim. Methods Softw., 2016
Convergence analysis of iterative methods for nonsmooth convex optimization over fixed point sets of quasi-nonexpansive mappings.
Math. Program., 2016
Recursive-Rule Extraction Algorithm With J48graft And Applications To Generating Credit Scores.
J. Artif. Intell. Soft Comput. Res., 2016
Proximal point algorithms for nonsmooth convex optimization with fixed point constraints.
Eur. J. Oper. Res., 2016
2015
Acceleration method for convex optimization over the fixed point set of a nonexpansive mapping.
Math. Program., 2015
Convex optimization over fixed point sets of quasi-nonexpansive and nonexpansive mappings in utility-based bandwidth allocation problems with operational constraints.
J. Comput. Appl. Math., 2015
2014
Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms.
SIAM J. Optim., 2014
2013
Fixed Point Optimization Algorithms for Distributed Optimization in Networked Systems.
SIAM J. Optim., 2013
2012
Iterative Algorithm for Triple-Hierarchical Constrained Nonconvex Optimization Problem and Its Application to Network Bandwidth Allocation.
SIAM J. Optim., 2012
Computational Method for Solving a Stochastic Linear-Quadratic Control Problem Given an Unsolvable Stochastic Algebraic Riccati Equation.
SIAM J. Control. Optim., 2012
Conjugate gradient methods using value of objective function for unconstrained optimization.
Optim. Lett., 2012
Fixed point optimization algorithm and its application to power control in CDMA data networks.
Math. Program., 2012
Fixed point optimization algorithm and its application to network bandwidth allocation.
J. Comput. Appl. Math., 2012
2011
Decentralized Algorithm for Centralized Variational Inequalities in Network Resource Allocation.
J. Optim. Theory Appl., 2011
Iterative Algorithm for Solving Triple-Hierarchical Constrained Optimization Problem.
J. Optim. Theory Appl., 2011
Fixed Point Optimization Algorithms for Network Bandwidth Allocation Problems with Compoundable Constraints.
IEEE Commun. Lett., 2011
Three-term conjugate gradient method for the convex optimization problem over the fixed point set of a nonexpansive mapping.
Appl. Math. Comput., 2011
2009
A Use of Conjugate Gradient Direction for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping.
SIAM J. Optim., 2009
J. Math. Model. Algorithms, 2009