Isao Ishikawa

Orcid: 0000-0002-3100-6187

According to our database, Isao Ishikawa authored at least 29 papers between 2018 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Koopman Spectrum Nonlinear Regulators and Efficient Online Learning.
Trans. Mach. Learn. Res., 2024

Constructive Universal Approximation Theorems for Deep Joint-Equivariant Networks by Schur's Lemma.
CoRR, 2024

Finite-dimensional approximations of push-forwards on locally analytic functionals and truncation of least-squares polynomials.
CoRR, 2024

Koopman operators with intrinsic observables in rigged reproducing kernel Hilbert spaces.
CoRR, 2024

A unified Fourier slice method to derive ridgelet transform for a variety of depth-2 neural networks.
CoRR, 2024

Koopman-based generalization bound: New aspect for full-rank weights.
Proceedings of the Twelfth International Conference on Learning Representations, 2024

2023
Generalized Eigenvalues of the Perron-Frobenius Operators of Symbolic Dynamical Systems.
SIAM J. Appl. Dyn. Syst., December, 2023

Universal Approximation Property of Invertible Neural Networks.
J. Mach. Learn. Res., 2023

Joint Group Invariant Functions on Data-Parameter Domain Induce Universal Neural Networks.
CoRR, 2023

Deep Ridgelet Transform: Voice with Koopman Operator Proves Universality of Formal Deep Networks.
CoRR, 2023

Koopman-Based Bound for Generalization: New Aspect of Neural Networks Regarding Nonlinear Noise Filtering.
CoRR, 2023

2022
Dynamic Structure Estimation from Bandit Feedback.
CoRR, 2022

Universality of Group Convolutional Neural Networks Based on Ridgelet Analysis on Groups.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

Fully-Connected Network on Noncompact Symmetric Space and Ridgelet Transform based on Helgason-Fourier Analysis.
Proceedings of the International Conference on Machine Learning, 2022

2021
Reproducing kernel Hilbert C*-module and kernel mean embeddings.
J. Mach. Learn. Res., 2021

Koopman Spectrum Nonlinear Regulator and Provably Efficient Online Learning.
CoRR, 2021

Ghosts in Neural Networks: Existence, Structure and Role of Infinite-Dimensional Null Space.
CoRR, 2021

Ridge Regression with Over-parametrized Two-Layer Networks Converge to Ridgelet Spectrum.
Proceedings of the 24th International Conference on Artificial Intelligence and Statistics, 2021

2020
Krylov Subspace Method for Nonlinear Dynamical Systems with Random Noise.
J. Mach. Learn. Res., 2020

Universal Approximation Property of Neural Ordinary Differential Equations.
CoRR, 2020

A global universality of two-layer neural networks with ReLU activations.
CoRR, 2020

Kernel Mean Embeddings of Von Neumann-Algebra-Valued Measures.
CoRR, 2020

Gradient Descent Converges to Ridgelet Spectrum.
CoRR, 2020

Analysis via Orthonormal Systems in Reproducing Kernel Hilbert C*-Modules and Applications.
CoRR, 2020

Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators.
Proceedings of the Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, 2020

2019
Metric on random dynamical systems with vector-valued reproducing kernel Hilbert spaces.
CoRR, 2019

2018
Metric on Nonlinear Dynamical Systems with Koopman Operators.
CoRR, 2018

Integral representation of the global minimizer.
CoRR, 2018

Metric on Nonlinear Dynamical Systems with Perron-Frobenius Operators.
Proceedings of the Advances in Neural Information Processing Systems 31: Annual Conference on Neural Information Processing Systems 2018, 2018
