Ido Nachum

According to our database, Ido Nachum authored at least 12 papers between 2017 and 2024.

Bibliography

2024
Which Algorithms Have Tight Generalization Bounds?
CoRR, 2024

Fantastic Generalization Measures are Nowhere to be Found.
Proceedings of the Twelfth International Conference on Learning Representations, 2024

2022
Finite Littlestone Dimension Implies Finite Information Complexity.
Proceedings of the IEEE International Symposium on Information Theory, 2022

A Johnson-Lindenstrauss Framework for Randomly Initialized CNNs.
Proceedings of the Tenth International Conference on Learning Representations, 2022

2021
Almost-Reed-Muller Codes Achieve Constant Rates for Random Errors.
IEEE Trans. Inf. Theory, 2021

Regularization by Misclassification in ReLU Neural Networks.
CoRR, 2021

2020
On Symmetry and Initialization for Neural Networks.
Proceedings of LATIN 2020: Theoretical Informatics, 2020

On the Perceptron's Compression.
Proceedings of Beyond the Horizon of Computability, 2020

2019
Average-Case Information Complexity of Learning.
Proceedings of Algorithmic Learning Theory, 2019

2018
A Direct Sum Result for the Information Complexity of Learning.
Proceedings of the Conference on Learning Theory, 2018

Learners that Use Little Information.
Proceedings of Algorithmic Learning Theory, 2018

2017
Learners that Leak Little Information.
CoRR, 2017
