Peter Harremoës

ORCID: 0000-0002-0441-6690

According to our database, Peter Harremoës authored at least 55 papers between 2001 and 2024.

Bibliography

2024
Reverse Information Projections and Optimal E-Statistics.
IEEE Trans. Inf. Theory, November, 2024

2023
Rate Distortion Theory for Descriptive Statistics.
Entropy, March, 2023

Universal Reverse Information Projections and Optimal E-statistics.
Proceedings of the IEEE International Symposium on Information Theory, 2023

2022
Unnormalized Measures in Information Theory.
CoRR, 2022

2020
Bounds on the information divergence for hypergeometric distributions.
Kybernetika, 2020

2019
Replication Papers.
Publ., 2019

The Rate Distortion Test of Normality.
Proceedings of the IEEE International Symposium on Information Theory, 2019

2018
Entropy Inequalities for Lattices.
Entropy, 2018

Statistical Inference and Exact Saddle Point Approximations.
Proceedings of the 2018 IEEE International Symposium on Information Theory, 2018

2017
Divergence and Sufficiency for Convex Optimization.
Entropy, 2017

Quantum information on spectral sets.
Proceedings of the 2017 IEEE International Symposium on Information Theory, 2017

2016
Bounds on tail probabilities for negative binomial distributions.
Kybernetika, 2016

Maximum Entropy and Sufficiency.
CoRR, 2016

Sufficiency on the Stock Market.
CoRR, 2016

2015
Lattices with non-Shannon inequalities.
Proceedings of the IEEE International Symposium on Information Theory, 2015

2014
Rényi Divergence and Kullback-Leibler Divergence.
IEEE Trans. Inf. Theory, 2014

Minimum KL-Divergence on Complements of L₁ Balls.
IEEE Trans. Inf. Theory, 2014

Mutual information of contingency tables and related inequalities.
Proceedings of the 2014 IEEE International Symposium on Information Theory, Honolulu, HI, USA, June 29, 2014

2013
Extendable MDL.
Proceedings of the 2013 IEEE International Symposium on Information Theory, 2013

Horizon-Independent Optimal Prediction with Log-Loss in Exponential Families.
Proceedings of the COLT 2013, 2013

2012
Information Divergence is more chi squared distributed than the chi squared statistics.
CoRR, 2012

Information divergence is more χ²-distributed than the χ²-statistics.
Proceedings of the 2012 IEEE International Symposium on Information Theory, 2012

2011
On Pairs of f-Divergences and Their Joint Range.
IEEE Trans. Inf. Theory, 2011

Lower bounds on Information Divergence.
CoRR, 2011

2010
Thinning, entropy, and the law of thin numbers.
IEEE Trans. Inf. Theory, 2010

Misleading Reference.
Entropy, 2010

Rényi Divergence and Its Properties.
CoRR, 2010

Joint range of f-divergences.
Proceedings of the IEEE International Symposium on Information Theory, 2010

Rényi divergence and majorization.
Proceedings of the IEEE International Symposium on Information Theory, 2010

2009
Joint Range of Rényi Entropies.
Kybernetika, 2009

Maximum Entropy on Compact Groups.
Entropy, 2009

Entropy - New Editor-in-Chief and Outlook.
Entropy, 2009

Regret and Jeffreys Integrals in Exponential Families.
CoRR, 2009

Testing goodness-of-fit via rate distortion.
Proceedings of the 2009 IEEE Information Theory Workshop, 2009

Finiteness of redundancy, regret, Shtarkov sums, and Jeffreys integrals in exponential families.
Proceedings of the IEEE International Symposium on Information Theory, 2009

2008
On the Bahadur-Efficient Testing of Uniformity by Means of the Entropy.
IEEE Trans. Inf. Theory, 2008

Entropy 2008, 10, 240-247: Ferri et al. Deformed Generalization of the Semiclassical Entropy.
Entropy, 2008

Efficiency of entropy testing.
Proceedings of the 2008 IEEE International Symposium on Information Theory, 2008

Thinning and information projections.
Proceedings of the 2008 IEEE International Symposium on Information Theory, 2008

2007
Entropy Testing is Efficient.
Proceedings of the IEEE International Symposium on Information Theory, 2007

The Information Bottleneck Revisited or How to Choose a Good Distortion Measure.
Proceedings of the IEEE International Symposium on Information Theory, 2007

Thinning and the Law of Small Numbers.
Proceedings of the IEEE International Symposium on Information Theory, 2007

2006
Rényi Entropies of Projections.
Proceedings of the 2006 IEEE International Symposium on Information Theory, 2006

2005
Entropy and the law of small numbers.
IEEE Trans. Inf. Theory, 2005

Zipf's law, hyperbolic distributions and entropy loss.
Electron. Notes Discret. Math., 2005

Lower Bounds for Divergence in Central Limit Theorem.
Electron. Notes Discret. Math., 2005

Maximum entropy and the Edgeworth expansion.
Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, 2005

Martingales and information divergence.
Proceedings of the 2005 IEEE International Symposium on Information Theory, 2005

2004
Rate of Convergence to Poisson Law in Terms of Information Divergence.
IEEE Trans. Inf. Theory, 2004

The weak information projection.
Proceedings of the 2004 IEEE International Symposium on Information Theory, 2004

2003
Refinements of Pinsker's inequality.
IEEE Trans. Inf. Theory, 2003

A Nash Equilibrium related to the Poisson Channel.
Commun. Inf. Syst., 2003

2001
Inequalities between entropy and index of coincidence derived from information diagrams.
IEEE Trans. Inf. Theory, 2001

Binomial and Poisson distributions as maximum entropy distributions.
IEEE Trans. Inf. Theory, 2001

Maximum Entropy Fundamentals.
Entropy, 2001
