Notations-1
To keep consistency for the readers (and for my sanity), this article will serve as a reference that introduces the notation used in the following articles: Self-Information, Entropy, Joint, Conditional an...
To understand the concept of mutual information, we first need to understand joint entropy, conditional entropy, and marginal entropy. These concepts are not necessarily hard, but they are introduce...
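Before diving into the definitions, a small numerical sketch may help show how these entropies relate to mutual information. This is an illustrative example only: the joint distribution `p_xy` is hypothetical, and the identities used (the chain rule `H(Y|X) = H(X, Y) - H(X)` and `I(X; Y) = H(X) + H(Y) - H(X, Y)`) are the standard ones, not notation specific to this series.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability table (zero-probability terms contribute 0)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# A hypothetical joint distribution p(x, y) over two binary variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

H_xy = entropy(p_xy)              # joint entropy H(X, Y)
H_x = entropy(p_xy.sum(axis=1))   # marginal entropy H(X)
H_y = entropy(p_xy.sum(axis=0))   # marginal entropy H(Y)
H_y_given_x = H_xy - H_x          # conditional entropy H(Y|X), via the chain rule
I_xy = H_x + H_y - H_xy           # mutual information I(X; Y)
```

Here both marginals are uniform, so `H_x` and `H_y` are each 1 bit, while the dependence between the variables makes `H_xy` less than 2 bits, leaving a positive mutual information.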
Self-Information to Entropy

In a previous post about self-information, I explained how to mathematically formalize the concept of information associated with a specific event. One key thing about ...
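The formalization referred to above is the standard definition of self-information, `I(x) = -log p(x)`. As a minimal sketch (the function name and examples are my own, not from the post):

```python
import math

def self_information(p, base=2):
    """Self-information (surprisal) of an event with probability p; bits when base=2."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log(p, base)

# Rarer events carry more information:
# a fair coin flip (p = 1/2) yields 1 bit, an event with p = 1/8 yields 3 bits,
# and a certain event (p = 1) yields 0 bits.
```

Note the key property this captures: information is additive for independent events, since `-log(p * q) = -log(p) - log(q)`.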
Quantifying information

The idea of quantifying information was first introduced by Claude E. Shannon [1] in his historic paper “A Mathematical Theory of Communication”. The basic idea was quite ...