Self-Information
Lambert M. Surhone, Miriam T. Timpledon, Susan F. Marseken
High Quality Content by WIKIPEDIA articles! In information theory (introduced by Claude E. Shannon, 1948), self-information is a measure of the information content associated with the outcome of a random variable. It is expressed in a unit of information, for example bits, nats, or hartleys, depending on the base of the logarithm used in its calculation. The term self-information is also sometimes used as a synonym for entropy, i.e. the expected value of self-information in the first sense, because I(X;X) = H(X), where I(X;X) is the mutual information of X with itself. These two meanings are not equivalent, and this article covers the first sense only. For the other sense, see entropy.
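By the standard definition (stated here for concreteness, since the blurb above does not spell it out), the self-information of an outcome x with probability P(x) is I(x) = -log_b P(x), where the base b of the logarithm selects the unit. A minimal Python sketch of this formula; the helper name self_information is illustrative, not from the book:

import math

def self_information(p: float, base: float = 2.0) -> float:
    # Self-information I(x) = -log_base(p) of an outcome with probability p.
    if not 0.0 < p <= 1.0:
        raise ValueError("p must be in (0, 1]")
    return -math.log(p, base)

# A fair coin flip (p = 0.5) expressed in each of the three units named above:
print(self_information(0.5, base=2))        # 1.0 bit
print(self_information(0.5, base=math.e))   # ~0.693 nats
print(self_information(0.5, base=10))       # ~0.301 hartleys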
ISBN: 978-6-1311-5568-0
Publisher: Книга по требованию (Book on Demand)
Release date: July 2011