# An entropy inequality for symmetric random variables.

We establish a lower bound on the entropy of weighted sums of (possibly dependent) random variables $(X_1, X_2, \dots, X_n)$ possessing a symmetric joint distribution. Our lower bound is in terms of the joint entropy of $(X_1, X_2, \dots, X_n)$. We show that for $n \geq 3$, the lower bound is tight if and only if the $X_i$'s are i.i.d. Gaussian random variables.
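For intuition on the Gaussian equality case, the following is a minimal numerical sketch, not the paper's proof: for i.i.d. $N(0, \sigma^2)$ variables and unit-norm weights, a weighted sum is again $N(0, \sigma^2)$, so its entropy equals $\tfrac{1}{n}$ of the joint entropy. The specific weights and variance below are illustrative choices, and the unit-norm normalization is an assumption about how the bound is stated.

```python
import math

def gaussian_entropy(var):
    # Differential entropy of N(0, var): 0.5 * ln(2 * pi * e * var)
    return 0.5 * math.log(2 * math.pi * math.e * var)

n = 3
sigma2 = 2.0  # common variance (illustrative)
a = [1 / math.sqrt(n)] * n  # unit-norm weight vector: sum of a_i^2 = 1
assert abs(sum(ai**2 for ai in a) - 1.0) < 1e-12

# A weighted sum of independent Gaussians is Gaussian with
# variance sum_i a_i^2 * sigma2 = sigma2 under the unit-norm constraint.
var_sum = sum(ai**2 for ai in a) * sigma2
h_sum = gaussian_entropy(var_sum)

# By independence, the joint entropy is the sum of the marginals.
h_joint = n * gaussian_entropy(sigma2)

# Equality: h(sum a_i X_i) = h(X_1, ..., X_n) / n in the i.i.d. Gaussian case.
print(h_sum, h_joint / n)
```

The check confirms only the elementary equality in the i.i.d. Gaussian case; the paper's contribution is the general lower bound for dependent symmetric distributions and the characterization of when it is attained.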

Publisher URL: http://arxiv.org/abs/1801.03868

arXiv ID: arXiv:1801.03868v1
