Dan Mikulincer

Orcid: 0000-0003-3597-3550

According to our database, Dan Mikulincer authored at least 13 papers between 2016 and 2024.

Bibliography

2024
How to Trap a Gradient Flow.
SIAM J. Comput., 2024

2023
Time Lower Bounds for the Metropolis Process and Simulated Annealing.
CoRR, 2023

Noise Stability on the Boolean Hypercube via a Renormalized Brownian Motion.
Proceedings of the 55th Annual ACM Symposium on Theory of Computing, 2023

Integrality Gaps for Random Integer Programs via Discrepancy.
Proceedings of the 2023 ACM-SIAM Symposium on Discrete Algorithms, 2023

Is This Correct? Let's Check!
Proceedings of the 14th Innovations in Theoretical Computer Science Conference, 2023

2022
Community detection and percolation of information in a geometric setting.
Comb. Probab. Comput., 2022

Size and depth of monotone neural networks: interpolation and approximation.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

Archimedes Meets Privacy: On Privately Estimating Quantiles in High Dimensions Under Minimal Assumptions.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

2021
Non-asymptotic approximations of neural networks by Gaussian processes.
Proceedings of the Conference on Learning Theory, 2021

2020
Network size and weights size for memorization with two-layers neural networks.
CoRR, 2020

Network size and size of the weights in memorization with two-layers neural networks.
Proceedings of the Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, 2020

2019
Stability of the Shannon-Stam inequality via the Föllmer process.
CoRR, 2019

2016
Information and dimensionality of anisotropic random geometric graphs.
CoRR, 2016
