Ioannis Kontoyiannis

ORCID: 0000-0001-7242-6375

Affiliations:
  • Athens University of Economics & Business, Greece


According to our database, Ioannis Kontoyiannis authored at least 103 papers between 1996 and 2024.

Awards

IEEE Fellow

IEEE Fellow 2011, "For contributions to data compression".

Bibliography

2024
Context-Tree Weighting and Bayesian Context Trees: Asymptotic and Non-Asymptotic Justifications.
IEEE Trans. Inf. Theory, February, 2024

Temporally Causal Discovery Tests for Discrete Time Series and Neural Spike Trains.
IEEE Trans. Signal Process., 2024

Finite de Finetti bounds in relative entropy.
CoRR, 2024

Finite-sample expansions for the optimal error probability in asymmetric binary hypothesis testing.
CoRR, 2024

Relative entropy bounds for sampling with and without replacement.
CoRR, 2024

The entropic doubling constant and robustness of Gaussian codebooks for additive-noise channels.
CoRR, 2024

Causality Testing, Directed Information and Spike Trains.
Proceedings of the IEEE International Symposium on Information Theory, 2024

The Optimal Finite-Sample Error Probability in Asymmetric Binary Hypothesis Testing.
Proceedings of the IEEE International Symposium on Information Theory, 2024

A Third Information-Theoretic Approach to Finite de Finetti Theorems.
Proceedings of the IEEE International Symposium on Information Theory, 2024

2023
A Third Information-Theoretic Approach to Finite de Finetti Theorems.
CoRR, 2023

Truly Bayesian Entropy Estimation.
Proceedings of the IEEE Information Theory Workshop, 2023

Context-tree weighting for real-valued time series: Bayesian inference with hierarchical mixture models.
Proceedings of the IEEE International Symposium on Information Theory, 2023

Time Series Analysis with Bayesian Context Trees: Classical Asymptotics and Finite-n Bounds.
Proceedings of the IEEE International Symposium on Information Theory, 2023

2022
Information in probability: Another information-theoretic proof of a finite de Finetti theorem.
CoRR, 2022

Posterior Representations for Bayesian Context Trees: Sampling, Estimation and Convergence.
CoRR, 2022

Compression and symmetry of small-world graphs and structures.
Commun. Inf. Syst., 2022

Bayesian Change-Point Detection via Context-Tree Weighting.
Proceedings of the IEEE Information Theory Workshop, 2022

Information-theoretic de Finetti-style theorems.
Proceedings of the IEEE Information Theory Workshop, 2022

The Posterior Distribution of Bayesian Context-Tree Models: Theory and Applications.
Proceedings of the IEEE International Symposium on Information Theory, 2022

The Entropic Central Limit Theorem for Discrete Random Variables.
Proceedings of the IEEE International Symposium on Information Theory, 2022

2021
Fundamental Limits of Lossless Data Compression With Side Information.
IEEE Trans. Inf. Theory, 2021

Differential Temporal Difference Learning.
IEEE Trans. Autom. Control., 2021

The ODE Method for Asymptotic Statistics in Stochastic Approximation and Reinforcement Learning.
CoRR, 2021

Entropy and the Discrete Central Limit Theorem.
CoRR, 2021

Inferring community characteristics in labelled networks.
CoRR, 2021

An Information-Theoretic Proof of a Finite de Finetti Theorem.
CoRR, 2021

Revisiting Context-Tree Weighting for Bayesian Inference.
Proceedings of the IEEE International Symposium on Information Theory, 2021

Symmetry and the Entropy of Small-World Structures and Graphs.
Proceedings of the IEEE International Symposium on Information Theory, 2021

2020
Nonasymptotic Gaussian Approximation for Inference With Stable Noise.
IEEE Trans. Inf. Theory, 2020

Packet Speed and Cost in Mobile Wireless Delay-Tolerant Networks.
IEEE Trans. Inf. Theory, 2020

A simple network of nodes moving on the circle.
Random Struct. Algorithms, 2020

Sharp Second-Order Pointwise Asymptotics for Lossless Compression with Side Information.
Entropy, 2020

Bayesian Context Trees: Modelling and exact inference for discrete time series.
CoRR, 2020

Lossless Data Compression with Side Information: Nonasymptotics and Dispersion.
Proceedings of the IEEE International Symposium on Information Theory, 2020

2019
The Lévy State Space Model.
Proceedings of the 53rd Asilomar Conference on Signals, Systems, and Computers, 2019

2018
Entropy Bounds on Abelian Groups and the Ruzsa Divergence.
IEEE Trans. Inf. Theory, 2018

Nonasymptotic Gaussian Approximation for Linear Systems with Stable Noise [Preliminary Version].
CoRR, 2018

Analysis of Geographic/Delay-Tolerant Routing in Mobile Wireless Networks.
CoRR, 2018

Deep Tree Models for 'Big' Biological Data.
Proceedings of the 19th IEEE International Workshop on Signal Processing Advances in Wireless Communications, 2018

Analysis of a One-Dimensional Continuous Delay-Tolerant Network Model.
Proceedings of the 19th IEEE International Workshop on Signal Processing Advances in Wireless Communications, 2018

Sharp Gaussian Approximation Bounds for Linear Systems with α-stable Noise.
Proceedings of the 2018 IEEE International Symposium on Information Theory, 2018

Asymptotics of the Packet Speed and Cost in a Mobile Wireless Network Model.
Proceedings of the 2018 IEEE International Symposium on Information Theory, 2018

2017
Exact speed and transmission cost in a simple one-dimensional wireless delay-tolerant network.
Proceedings of the 2017 IEEE International Symposium on Information Theory, 2017

Simulated convergence rates with application to an intractable α-stable inference problem.
Proceedings of the 2017 IEEE 7th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, 2017

2016
Estimating the Directed Information and Testing for Causality.
IEEE Trans. Inf. Theory, 2016

2015
The Ruzsa divergence for random elements in locally compact abelian groups.
CoRR, 2015

2014
Optimal Lossless Data Compression: Non-Asymptotics and Asymptotics.
IEEE Trans. Inf. Theory, 2014

Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information.
IEEE Trans. Inf. Theory, 2014

2013
Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures.
Discret. Appl. Math., 2013

The entropy of sums and Ruzsa's divergence on abelian groups.
Proceedings of the 2013 IEEE Information Theory Workshop, 2013

Optimal lossless compression: Source varentropy and dispersion.
Proceedings of the 2013 IEEE International Symposium on Information Theory, 2013

Lossless compression with moderate error probability.
Proceedings of the 2013 IEEE International Symposium on Information Theory, 2013

2012
Complexity-compression tradeoffs in lossy compression via efficient random codebooks and databases.
Probl. Inf. Transm., 2012

Lossless Data Compression at Finite Blocklengths
CoRR, 2012

Sumset inequalities for differential entropy and mutual information.
Proceedings of the 2012 IEEE International Symposium on Information Theory, 2012

Lossless data compression rate: Asymptotics and non-asymptotics.
Proceedings of the 46th Annual Conference on Information Sciences and Systems, 2012

2010
Thinning, entropy, and the law of thin numbers.
IEEE Trans. Inf. Theory, 2010

Compound Poisson Approximation via Information Functionals
CoRR, 2010

The entropies of the sum and the difference of two IID random variables are not too different.
Proceedings of the IEEE International Symposium on Information Theory, 2010

2009
Lossy Compression in Near-Linear Time via Efficient Random Codebooks and Databases
CoRR, 2009

Efficient random codebooks and databases for lossy compression in near-linear time.
Proceedings of the 2009 IEEE Information Theory Workshop, 2009

A criterion for the compound poisson distribution to be maximum entropy.
Proceedings of the IEEE International Symposium on Information Theory, 2009

2008
Estimation of the Rate-Distortion Function.
IEEE Trans. Inf. Theory, 2008

Information and Complexity in Statistical Modeling by Jorma Rissanen.
Am. Math. Mon., 2008

Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study.
Entropy, 2008

On the entropy and log-concavity of compound Poisson measures
CoRR, 2008

Control variates as screening functions.
Proceedings of the 3rd International ICST Conference on Performance Evaluation Methodologies and Tools, 2008

Counting the primes using entropy.
Proceedings of the 2008 IEEE Information Theory Workshop, 2008

Thinning and information projections.
Proceedings of the 2008 IEEE International Symposium on Information Theory, 2008

2007
Identifying Statistical Dependence in Genomic Sequences via Mutual Information Estimates.
EURASIP J. Bioinform. Syst. Biol., 2007

Some information-theoretic computations related to the distribution of prime numbers
CoRR, 2007

Fisher Information, Compound Poisson Approximation, and the Poisson Channel.
Proceedings of the IEEE International Symposium on Information Theory, 2007

Thinning and the Law of Small Numbers.
Proceedings of the IEEE International Symposium on Information Theory, 2007

Statistical Dependence in Biological Sequences.
Proceedings of the IEEE International Symposium on Information Theory, 2007

2006
Mismatched codebooks and the role of entropy coding in lossy data compression.
IEEE Trans. Inf. Theory, 2006

Exponential bounds and stopping rules for MCMC and general Markov chains.
Proceedings of the 1st International Conference on Performance Evaluation Methodolgies and Tools, 2006

Entropy Estimation: Simulation, Theory and a Case Study.
Proceedings of the 2006 IEEE Information Theory Workshop, 2006

On Estimating the Rate-Distortion Function.
Proceedings of the 2006 IEEE International Symposium on Information Theory, 2006

From the Entropy to the Statistical Structure of Spike Trains.
Proceedings of the 2006 IEEE International Symposium on Information Theory, 2006

2005
Entropy and the law of small numbers.
IEEE Trans. Inf. Theory, 2005

Steady state analysis of balanced-allocation routing.
Random Struct. Algorithms, 2005

Filtering: the case for "noisier" data.
Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, 2005

Concentration and relative entropy for compound Poisson distributions.
Proceedings of the 2005 IEEE International Symposium on Information Theory, 2005

Relative entropy and exponential deviation bounds for general Markov chains.
Proceedings of the 2005 IEEE International Symposium on Information Theory, 2005

Mutual information, synergy and some curious phenomena for simple channels.
Proceedings of the 2005 IEEE International Symposium on Information Theory, 2005

2004
Entropy, compound Poisson approximation, log-Sobolev inequalities and measure concentration.
Proceedings of the 2004 IEEE Information Theory Workshop, 2004

Minimum description length vs. maximum likelihood in lossy data compression.
Proceedings of the 2004 IEEE International Symposium on Information Theory, 2004

2003
Source coding exponents for zero-delay coding with finite memory.
IEEE Trans. Inf. Theory, 2003

Pattern matching and lossy data compression on random fields.
IEEE Trans. Inf. Theory, 2003

2002
Arbitrary source models and Bayesian codebooks in rate-distortion theory.
IEEE Trans. Inf. Theory, 2002

Source coding, large deviations, and approximate pattern matching.
IEEE Trans. Inf. Theory, 2002

2001
Sphere-covering, measure concentration, and source coding.
IEEE Trans. Inf. Theory, 2001

Critical behavior in lossy source coding.
IEEE Trans. Inf. Theory, 2001

Unified spatial diversity combining and power allocation for CDMA systems in multiple time-scale fading channels.
IEEE J. Sel. Areas Commun., 2001

2000
Pointwise redundancy in lossy data compression and universal lossy data compression.
IEEE Trans. Inf. Theory, 2000

Unified spatial diversity combining and power allocation schemes for CDMA systems.
Proceedings of the Global Telecommunications Conference (GLOBECOM 2000), San Francisco, CA, USA, 2000

1999
An implementable lossy version of the Lempel-Ziv algorithm - Part I: Optimality for memoryless sources.
IEEE Trans. Inf. Theory, 1999

Efficient sphere-covering and converse measure concentration via generalized coding theorems
CoRR, 1999

1998
Nonparametric Entropy Estimation for Stationary Processes and Random Fields, with Applications to English Text.
IEEE Trans. Inf. Theory, 1998

Progressive search and retrieval in large image archives.
IBM J. Res. Dev., 1998

1997
Second-order noiseless source coding theorems.
IEEE Trans. Inf. Theory, 1997

1996
Progressive classification in the compressed domain for large EOS satellite databases.
Proceedings of the 1996 IEEE International Conference on Acoustics, Speech, and Signal Processing, 1996

Stationary Entropy Estimation via String Matching.
Proceedings of the 6th Data Compression Conference (DCC '96), Snowbird, Utah, USA, March 31, 1996

