Kullback–Leibler Divergence
Frederic P. Miller, Agnes F. Vandome, John McBrewster

Paperback

About the book "Kullback–Leibler Divergence"

High Quality Content by WIKIPEDIA articles! In probability theory and information theory, the Kullback–Leibler divergence (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. The KL divergence measures the expected number of extra bits required to code samples from P when using a code based on Q, rather than using a code based on P. Typically P represents the "true" distribution of data,...
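
To make the coding interpretation above concrete, here is a minimal illustrative sketch in Python (not taken from the book itself). It computes D(P||Q) in bits, i.e. with base-2 logarithms, over a finite set of outcomes; the distributions P and Q in the usage example are arbitrary values chosen only to show the non-symmetry mentioned in the annotation.

import math

def kl_divergence_bits(p, q):
    """Kullback–Leibler divergence D(P || Q) in bits (log base 2).

    p, q: probabilities of the same finite set of outcomes.
    Terms with p[i] == 0 contribute 0 by convention; the divergence
    is infinite if q[i] == 0 anywhere that p[i] > 0.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            if qi == 0:
                return float("inf")
            total += pi * math.log2(pi / qi)
    return total

# Non-symmetry: D(P||Q) and D(Q||P) differ in general.
P = [0.5, 0.5]
Q = [0.9, 0.1]
print(kl_divergence_bits(P, Q))  # ~0.737 extra bits per sample
print(kl_divergence_bits(Q, P))  # ~0.531, a different value

Each term weights the per-outcome coding penalty log2(p/q) by how often that outcome actually occurs under P, which is why the result reads as "expected extra bits per sample" when coding P-distributed data with a code optimized for Q.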