Amichai Painsky

ORCID: 0000-0002-5899-5608

According to our database, Amichai Painsky authored at least 36 papers between 2010 and 2024.

Bibliography

2024
A Comparative Analysis of Discrete Entropy Estimators for Large-Alphabet Problems.
Entropy, May, 2024

Neural Joint Entropy Estimation.
IEEE Trans. Neural Networks Learn. Syst., April, 2024

Cross-validated tree-based models for multi-target learning.
Frontiers Artif. Intell., 2024

Distribution Estimation under the Infinity Norm.
CoRR, 2024

2023
Detecting Non-Overlapping Signals with Dynamic Programming.
Entropy, February, 2023

2022
Convergence Guarantees for the Good-Turing Estimator.
J. Mach. Learn. Res., 2022

Feature Importance in Gradient Boosting Trees with Cross-Validation Feature Selection.
Entropy, 2022

Confidence Intervals for Unobserved Events.
CoRR, 2022

A Data-driven Missing Mass Estimation Framework.
Proceedings of the IEEE International Symposium on Information Theory, 2022

2021
Robust Universal Inference.
Entropy, 2021

Refined Convergence Rates of the Good-Turing Estimator.
Proceedings of the IEEE Information Theory Workshop, 2021

2020
Bregman Divergence Bounds and Universality Properties of the Logarithmic Loss.
IEEE Trans. Inf. Theory, 2020

Innovation Representation of Stochastic Processes With Application to Causal Inference.
IEEE Trans. Inf. Theory, 2020

Nonlinear Canonical Correlation Analysis: A Compressed Representation Approach.
Entropy, 2020

2019
Lossless Compression of Random Forests.
J. Comput. Sci. Technol., 2019

2018
Linear Independent Component Analysis Over Finite Fields: Algorithms and Bounds.
IEEE Trans. Signal Process., 2018

An Information-Theoretic Framework for Non-linear Canonical Correlation Analysis.
CoRR, 2018

Lossless (and Lossy) Compression of Random Forests.
CoRR, 2018

Bregman Divergence Bounds and the Universality of the Logarithmic Loss.
CoRR, 2018

MSc Dissertation: Exclusive Row Biclustering for Gene Expression Using a Combinatorial Auction Approach.
CoRR, 2018

Generalized Independent Components Analysis Over Finite Alphabets.
CoRR, 2018

On the Universality of the Logistic Loss Function.
Proceedings of the 2018 IEEE International Symposium on Information Theory, 2018

2017
Large Alphabet Source Coding Using Independent Component Analysis.
IEEE Trans. Inf. Theory, 2017

Cross-Validated Variable Selection in Tree-Based Methods Improves Predictive Performance.
IEEE Trans. Pattern Anal. Mach. Intell., 2017

Gaussian Lower Bound for the Information Bottleneck Limit.
J. Mach. Learn. Res., 2017

2016
Generalized Independent Component Analysis Over Finite Alphabets.
IEEE Trans. Inf. Theory, 2016

Isotonic Modeling with Non-Differentiable Loss Functions with Application to Lasso Regularization.
IEEE Trans. Pattern Anal. Mach. Intell., 2016

Binary independent component analysis: Theory, bounds and algorithms.
Proceedings of the 26th IEEE International Workshop on Machine Learning for Signal Processing, 2016

Compressing Random Forests.
Proceedings of the IEEE 16th International Conference on Data Mining, 2016

A Simple and Efficient Approach for Adaptive Entropy Coding over Large Alphabets.
Proceedings of the 2016 Data Compression Conference, 2016

2015
Universal Compression of Memoryless Sources over Large Alphabets via Independent Component Analysis.
Proceedings of the 2015 Data Compression Conference, 2015

2014
Optimal Set Cover Formulation for Exclusive Row Biclustering of Gene Expression.
J. Comput. Sci. Technol., 2014

Generalized binary independent component analysis.
Proceedings of the 2014 IEEE International Symposium on Information Theory, Honolulu, HI, USA, 2014

2013
Memoryless representation of Markov processes.
Proceedings of the 2013 IEEE International Symposium on Information Theory, 2013

2012
Exclusive Row Biclustering for Gene Expression Using a Combinatorial Auction Approach.
Proceedings of the 12th IEEE International Conference on Data Mining, 2012

2010
First Order Multiple Hypothesis Testing for the Global Nearest Neighbor Data Correlation Approach.
Proceedings of the 40. Jahrestagung der Gesellschaft für Informatik, Service Science - Neue Perspektiven für die Informatik, INFORMATIK 2010, Leipzig, Germany, September 27, 2010
