Stefano Favaro

According to our database, Stefano Favaro authored at least 27 papers between 2014 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2024
Random measure priors in Bayesian recovery from sketches.
J. Mach. Learn. Res., 2024

Function-Space MCMC for Bayesian Wide Neural Networks.
CoRR, 2024

Improved prediction of future user activity in online A/B testing.
CoRR, 2024

A Nonparametric Bayes Approach to Online Activity Prediction.
CoRR, 2024

2023
Conformal Frequency Estimation using Discrete Sketched Data with Coverage for Distinct Queries.
J. Mach. Learn. Res., 2023

Learning-augmented count-min sketches via Bayesian nonparametrics.
J. Mach. Learn. Res., 2023

Frequency and cardinality recovery from sketched data: a novel approach bridging Bayesian and frequentist views.
CoRR, 2023

Quantitative CLTs in Deep Neural Networks.
CoRR, 2023

Non-asymptotic approximations of Gaussian neural networks via second-order Poincaré inequalities.
CoRR, 2023

2022
Infinitely wide limits for deep Stable neural networks: sub-linear, linear and super-linear activation functions.
Trans. Mach. Learn. Res., 2022

Neural tangent kernel analysis of shallow α-Stable ReLU neural networks.
CoRR, 2022

Conformal Frequency Estimation with Sketched Data.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

2021
Doubly infinite residual neural networks: a diffusion process approach.
J. Mach. Learn. Res., 2021

Consistent estimation of small masses in feature sampling.
J. Mach. Learn. Res., 2021

Deep Stable neural networks: large-width asymptotics and convergence rates.
CoRR, 2021

Infinite-channel deep stable convolutional neural networks.
CoRR, 2021

Large-width functional asymptotics for deep Gaussian neural networks.
Proceedings of the 9th International Conference on Learning Representations, 2021

A Bayesian nonparametric approach to count-min sketch under power-law data streams.
Proceedings of the 24th International Conference on Artificial Intelligence and Statistics, 2021

2020
Doubly infinite residual networks: a diffusion process approach.
CoRR, 2020

Stable behaviour of infinitely wide deep neural networks.
Proceedings of the 23rd International Conference on Artificial Intelligence and Statistics, 2020

Infinitely deep neural networks as diffusion processes.
Proceedings of the 23rd International Conference on Artificial Intelligence and Statistics, 2020

2019
Neural Stochastic Differential Equations.
CoRR, 2019

2015
On a class of σ-stable Poisson-Kingman models and an effective marginalized sampler.
Stat. Comput., 2015

Are Gibbs-Type Priors the Most Natural Generalization of the Dirichlet Process?
IEEE Trans. Pattern Anal. Mach. Intell., 2015

A hybrid sampler for Poisson-Kingman mixture models.
Proceedings of the Advances in Neural Information Processing Systems 28: Annual Conference on Neural Information Processing Systems 2015, 2015

2014
Discussion of "On simulation and properties of the stable law" by L. Devroye and L. James.
Stat. Methods Appl., 2014

Posterior analysis of rare variants in Gibbs-type species sampling models.
J. Multivar. Anal., 2014
