Simone Brugiapaglia
ORCID: 0000-0003-1927-8232
According to our database, Simone Brugiapaglia authored at least 32 papers between 2014 and 2025.
Bibliography
2025
Neural Networks, 2025
Near-optimal learning of Banach-valued, high-dimensional functions via deep neural networks.
Neural Networks, 2025
2024
SIAM J. Optim., 2024
Physics-informed deep learning and compressive collocation for high-dimensional diffusion-reaction equations: practical existence theory and numerics.
CoRR, 2024
Learning smooth functions in high dimensions: from sparse polynomials to deep neural networks.
CoRR, 2024
Neural Rank Collapse: Weight Decay and Small Within-Class Variability Yield Low-Rank Bias.
CoRR, 2024
A practical existence theorem for reduced order models based on convolutional autoencoders.
CoRR, 2024
2023
LASSO Reloaded: A Variational Analysis Perspective with Applications to Compressed Sensing.
SIAM J. Math. Data Sci., December 2023
The greedy side of the LASSO: New algorithms for weighted sparse recovery via loss function-based orthogonal matching pursuit.
CoRR, 2023
2022
A Coherence Parameter Characterizing Generative Compressed Sensing With Fourier Measurements.
IEEE J. Sel. Areas Inf. Theory, September 2022
Invariance, Encodings, and Generalization: Learning Identity Effects With Neural Networks.
Neural Comput., 2022
Do Log Factors Matter? On Optimal Wavelet Approximation and the Foundations of Compressed Sensing.
Found. Comput. Math., 2022
Is Monte Carlo a bad sampling strategy for learning smooth functions in high dimensions?
CoRR, 2022
Compressive Fourier collocation methods for high-dimensional diffusion equations with periodic boundary conditions.
CoRR, 2022
On efficient algorithms for computing near-best polynomial approximations to high-dimensional, Hilbert-valued functions from limited samples.
CoRR, 2022
2021
The Benefits of Acting Locally: Reconstruction Algorithms for Sparse in Levels Signals With Stable and Robust Recovery Guarantees.
IEEE Trans. Signal Process., 2021
Iterative and greedy algorithms for the sparsity in levels model in compressed sensing.
CoRR, 2021
Invariance, encodings, and generalization: learning identity effects with neural networks.
CoRR, 2021
Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data.
Proceedings of the Mathematical and Scientific Machine Learning, 2021
Learning High-Dimensional Hilbert-Valued Functions With Deep Neural Networks From Limited Data.
Proceedings of the AAAI 2021 Spring Symposium on Combining Artificial Intelligence and Machine Learning with Physical Sciences, Stanford, CA, USA, March 22nd-24th, 2021
2020
Sparse recovery in bounded Riesz systems with applications to numerical methods for PDEs.
CoRR, 2020
Generalizing Outside the Training Set: When Can Neural Networks Learn Identity Effects?
Proceedings of the 42nd Annual Meeting of the Cognitive Science Society, 2020
2019
Numerische Mathematik, 2019
2018
Math. Comput., 2018
Sparse approximation of multivariate functions from small datasets via weighted orthogonal matching pursuit.
CoRR, 2018
2015
Compressed solving: A numerical approximation technique for elliptic PDEs based on Compressed Sensing.
Comput. Math. Appl., 2015
2014
J. Comput. Appl. Math., 2014