Mokshay M. Madiman

ORCID: 0000-0002-2992-1829

According to our database, Mokshay M. Madiman authored at least 71 papers between 2004 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of two.

Bibliography

2024
Volumes of Subset Minkowski Sums and the Lyusternik Region.
Discret. Comput. Geom., April, 2024

The entropic doubling constant and robustness of Gaussian codebooks for additive-noise channels.
CoRR, 2024

2023
A Closed-Form EVSI Expression for a Multinomial Data-Generating Process.
Decis. Anal., March, 2023

Submodular Function Inequalities Indexed by Chordal Graphs.
Proceedings of the IEEE International Symposium on Information Theory, 2023

2022
The Differential Entropy of Mixtures: New Bounds and Applications.
IEEE Trans. Inf. Theory, 2022

2021
Sharp Moment-Entropy Inequalities and Capacity Bounds for Symmetric Log-Concave Distributions.
IEEE Trans. Inf. Theory, 2021

Entropy Inequalities for Sums in Prime Cyclic Groups.
SIAM J. Discret. Math., 2021

Bernoulli sums and Rényi entropy inequalities.
CoRR, 2021

2020
Conditional Rényi Entropy and the Relationships between Rényi Capacities.
Entropy, 2020

Usable deviation bounds for the information content of convex measures.
Proceedings of the IEEE International Symposium on Information Theory, 2020

2019
Combinatorial Entropy Power Inequalities: A Preliminary Study of the Stam Region.
IEEE Trans. Inf. Theory, 2019

Majorization and Rényi entropy inequalities via Sperner theory.
Discret. Math., 2019

A Combinatorial Approach to Small Ball Inequalities for Sums and Differences.
Comb. Probab. Comput., 2019

Two remarks on generalized entropy power inequalities.
CoRR, 2019

On the question of the best additive noise among symmetric log-concave noises.
Proceedings of the IEEE International Symposium on Information Theory, 2019

Remarks on Rényi versions of conditional entropy and mutual information.
Proceedings of the IEEE International Symposium on Information Theory, 2019

Relationships between certain f-divergences.
Proceedings of the 57th Annual Allerton Conference on Communication, 2019

2018
Entropy Bounds on Abelian Groups and the Ruzsa Divergence.
IEEE Trans. Inf. Theory, 2018

Entropy versus variance for symmetric log-concave random variables and related problems.
CoRR, 2018

An Exact Upper Bound on the L^p Lebesgue Constant and The ∞-Rényi Entropy Power Inequality for Integer Valued Random Variables.
CoRR, 2018

Design of Discrete Constellations for Peak-Power-Limited Complex Gaussian Channels.
Proceedings of the 2018 IEEE International Symposium on Information Theory, 2018

2017
Entropies of Weighted Sums in Cyclic Groups and an Application to Polar Codes.
Entropy, 2017

Discrete Entropy Power Inequalities via Sperner Theory.
CoRR, 2017

Infinity-Rényi entropy power inequalities.
Proceedings of the 2017 IEEE International Symposium on Information Theory, 2017

A min-entropy power inequality for groups.
Proceedings of the 2017 IEEE International Symposium on Information Theory, 2017

2016
The norm of the Fourier transform on compact or discrete abelian groups.
CoRR, 2016

Forward and Reverse Entropy Power Inequalities in Convex Geometry.
CoRR, 2016

Reverse entropy power inequalities for s-concave densities.
Proceedings of the IEEE International Symposium on Information Theory, 2016

Information concentration for convex measures.
Proceedings of the IEEE International Symposium on Information Theory, 2016

2015
The Ruzsa divergence for random elements in locally compact abelian groups.
CoRR, 2015

Optimal Concentration of Information Content For Log-Concave Densities.
CoRR, 2015

Entropies of weighted sums in cyclic groups and applications to polar codes.
CoRR, 2015

The norm of the Fourier series operator.
Proceedings of the IEEE International Symposium on Information Theory, 2015

A discrete entropy power inequality for uniform distributions.
Proceedings of the IEEE International Symposium on Information Theory, 2015

Extracting semantic information without linguistic cues from generic sentences.
Proceedings of the 53rd Annual Allerton Conference on Communication, 2015

2014
Beyond the Entropy Power Inequality, via Rearrangements.
IEEE Trans. Inf. Theory, 2014

Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information.
IEEE Trans. Inf. Theory, 2014

A lower bound on the Rényi entropy of convolutions in the integers.
Proceedings of the 2014 IEEE International Symposium on Information Theory, 2014

2013
Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures.
Discret. Appl. Math., 2013

The entropy of sums and Ruzsa's divergence on abelian groups.
Proceedings of the 2013 IEEE Information Theory Workshop, 2013

Unfolding the entropy power inequality.
Proceedings of the 2013 Information Theory and Applications Workshop, 2013

A new approach to the entropy power inequality, via rearrangements.
Proceedings of the 2013 IEEE International Symposium on Information Theory, 2013

2012
Entropy and set cardinality inequalities for partition-determined functions.
Random Struct. Algorithms, 2012

Sumset inequalities for differential entropy and mutual information.
Proceedings of the 2012 IEEE International Symposium on Information Theory, 2012

An equipartition property for high-dimensional log-concave distributions.
Proceedings of the 50th Annual Allerton Conference on Communication, 2012

2011
The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions.
IEEE Trans. Inf. Theory, 2011

Dimensional behaviour of entropy and information.
CoRR, 2011

On the problem of reversibility of the entropy power inequality.
CoRR, 2011

2010
Information inequalities for joint distributions, with interpretations and applications.
IEEE Trans. Inf. Theory, 2010

Fractional generalizations of Young and Brunn-Minkowski inequalities.
CoRR, 2010

Compound Poisson Approximation via Information Functionals.
CoRR, 2010

Patterns and exchangeability.
Proceedings of the IEEE International Symposium on Information Theory, 2010

The entropies of the sum and the difference of two IID random variables are not too different.
Proceedings of the IEEE International Symposium on Information Theory, 2010

Entropy and the hyperplane conjecture in convex geometry.
Proceedings of the IEEE International Symposium on Information Theory, 2010

Redundancy of exchangeable estimators.
Proceedings of the 48th Annual Allerton Conference on Communication, 2010

Fundamental limits for distributed estimation using a sensor field.
Proceedings of the 48th Annual Allerton Conference on Communication, 2010

2009
Entropy and set cardinality inequalities for partition-determined functions, with applications to sumsets.
CoRR, 2009

A model for pricing data bundles based on minimax risks for estimation of a location parameter.
Proceedings of the 2009 IEEE Information Theory Workshop, 2009

The entropy power of a sum is fractionally superadditive.
Proceedings of the IEEE International Symposium on Information Theory, 2009

A criterion for the compound Poisson distribution to be maximum entropy.
Proceedings of the IEEE International Symposium on Information Theory, 2009

2008
Cores of Cooperative Games in Information Theory.
EURASIP J. Wirel. Commun. Netw., 2008

On the entropy and log-concavity of compound Poisson measures.
CoRR, 2008

On the entropy of sums.
Proceedings of the 2008 IEEE Information Theory Workshop, 2008

Playing games: A fresh look at rate and capacity regions.
Proceedings of the 2008 IEEE International Symposium on Information Theory, 2008

2007
Generalized Entropy Power Inequalities and Monotonicity Properties of Information.
IEEE Trans. Inf. Theory, 2007

Sandwich bounds for joint entropy.
Proceedings of the IEEE International Symposium on Information Theory, 2007

Fisher Information, Compound Poisson Approximation, and the Poisson Channel.
Proceedings of the IEEE International Symposium on Information Theory, 2007

2006
The Monotonicity of Information in the Central Limit Theorem and Entropy Power Inequalities.
Proceedings of the 2006 IEEE International Symposium on Information Theory, 2006

2005
Concentration and relative entropy for compound Poisson distributions.
Proceedings of the 2005 IEEE International Symposium on Information Theory, 2005

2004
Entropy, compound Poisson approximation, log-Sobolev inequalities and measure concentration.
Proceedings of the 2004 IEEE Information Theory Workshop, 2004

Minimum description length vs. maximum likelihood in lossy data compression.
Proceedings of the 2004 IEEE International Symposium on Information Theory, 2004
