Universal Estimation of Information Measures for Analog Sources

Entropy, mutual information, and divergence measure the randomness, dependence, and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among other fields, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures, one that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several non...
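For an analog (continuous-valued) source, the three measures named above are the familiar differential versions, stated here for reference in standard notation (the notation is mine, not taken from the listing):

```latex
h(X) = -\int f_X(x)\,\log f_X(x)\,\mathrm{d}x,
\qquad
I(X;Y) = \iint f_{XY}(x,y)\,\log\frac{f_{XY}(x,y)}{f_X(x)\,f_Y(y)}\,\mathrm{d}x\,\mathrm{d}y,
\qquad
D(f\,\|\,g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,\mathrm{d}x.
```

To make "universal estimation" concrete, the sketch below implements one classical distribution-free estimator of differential entropy, the Kozachenko-Leonenko k-nearest-neighbor estimator, which is representative of the nonparametric approach but is not claimed to be an algorithm from this monograph; the function name and the default k are illustrative choices.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy_estimate(x, k=3):
    """Kozachenko-Leonenko k-nearest-neighbor estimate of differential
    entropy in nats, from i.i.d. samples of an unknown density.

    x : (n, d) array of samples; k : neighbor index (small, fixed).
    Assumes no duplicate sample points (a zero distance would make
    log(eps) diverge).
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Query k+1 neighbors because each point is its own nearest
    # neighbor at distance zero.
    dist, _ = cKDTree(x).query(x, k=k + 1)
    eps = dist[:, k]                      # distance to the k-th true neighbor
    # Log-volume of the unit Euclidean ball in d dimensions.
    log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0)
    # Estimate: psi(n) - psi(k) + log V_d + (d/n) * sum_i log eps_i
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

# Sanity check against a known value: the differential entropy of N(0,1)
# is 0.5 * log(2 * pi * e) = 1.4189 nats.
rng = np.random.default_rng(0)
print(kl_entropy_estimate(rng.standard_normal((5000, 1))))
```

Note that the estimator uses nothing about the underlying density beyond the samples themselves, which is exactly the universality property the abstract refers to; only the neighbor order k is chosen by the user.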