David Holzmüller

Orcid: 0000-0002-9443-0049

According to our database, David Holzmüller authored at least 11 papers between 2017 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2024
Active Learning for Neural PDE Solvers.
CoRR, 2024

Better by Default: Strong Pre-Tuned MLPs and Boosted Trees on Tabular Data.
CoRR, 2024

2023
Regression from linear models to neural networks: double descent, active learning, and sampling.
PhD thesis, 2023

A Framework and Benchmark for Deep Batch Active Learning for Regression.
J. Mach. Learn. Res., 2023

Convergence Rates for Non-Log-Concave Sampling and Log-Partition Estimation.
CoRR, 2023

Mind the spikes: Benign overfitting of kernels and neural networks in fixed dimension.
Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023

2022
Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent.
J. Mach. Learn. Res., 2022

2021
On the Universality of the Double Descent Peak in Ridgeless Regression.
Proceedings of the 9th International Conference on Learning Representations, 2021

2020
Muscles Reduce Neuronal Information Load: Quantification of Control Effort in Biological vs. Robotic Pointing and Walking.
Frontiers Robotics AI, 2020

2017
Improved Approximation Schemes for the Restricted Shortest Path Problem.
CoRR, 2017

Efficient Neighbor-Finding on Space-Filling Curves.
CoRR, 2017

