Stephan Wojtowytsch
ORCID: 0000-0003-3766-5332
According to our database, Stephan Wojtowytsch authored at least 24 papers between 2019 and 2024.
Online presence: orcid.org
Bibliography
2024
Stochastic Gradient Descent with Noise of Machine Learning Type Part II: Continuous Time Analysis.
J. Nonlinear Sci., February, 2024
Optimal Bump Functions for Shallow ReLU networks: Weight Decay, Depth Separation, Curse of Dimensionality.
J. Mach. Learn. Res., 2024
SineNet: Learning Temporal Dynamics in Time-Dependent Partial Differential Equations.
Proceedings of the Twelfth International Conference on Learning Representations, 2024
2023
Stochastic Gradient Descent with Noise of Machine Learning Type Part I: Discrete Time Analysis.
J. Nonlinear Sci., June, 2023
A qualitative difference between gradient flows of convex functions in finite- and infinite-dimensional Hilbert spaces.
CoRR, 2023
Proceedings of the Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, 2023
Proceedings of the International Conference on Machine Learning, 2023
2022
Optimal bump functions for shallow ReLU networks: Weight decay, depth separation and the curse of dimensionality.
CoRR, 2022
Qualitative neural network approximation over R and C: Elementary proofs for analytic and polynomial activation.
CoRR, 2022
2021
On the Motion of Curved Dislocations in Three Dimensions: Simplified Linearized Elasticity.
SIAM J. Math. Anal., 2021
On the emergence of simplex symmetry in the final and penultimate layers of neural network classifiers.
Proceedings of the Mathematical and Scientific Machine Learning, 2021
Some observations on high-dimensional partial differential equations with Barron data.
Proceedings of the Mathematical and Scientific Machine Learning, 2021
2020
Can Shallow Neural Networks Beat the Curse of Dimensionality? A Mean Field Training Perspective.
IEEE Trans. Artif. Intell., 2020
On the emergence of tetrahedral symmetry in the final and penultimate layers of neural network classifiers.
CoRR, 2020
Some observations on partial differential equations in Barron and multi-layer spaces.
CoRR, 2020
Towards a Mathematical Understanding of Neural Network-Based Machine Learning: what we know and what we don't.
CoRR, 2020
On the Banach spaces associated with multi-layer ReLU networks: Function representation, approximation theory and gradient descent dynamics.
CoRR, 2020
On the Convergence of Gradient Descent Training for Two-layer ReLU-networks in the Mean Field Regime.
CoRR, 2020
Kolmogorov Width Decay and Poor Approximators in Machine Learning: Shallow Neural Networks, Random Feature Models and Neural Tangent Kernels.
CoRR, 2020
2019
SIAM J. Math. Anal., 2019