Seller
Second Story Books, ABAA, Rockville, MD, USA
Seller rating: 4 of 5 stars
Heritage Bookseller
AbeBooks member since 1996
Octavo; Fair+; paperback; green spine with black print; cover has slight edgewear and puckering to the top spine corner, else clean and bright; text block has a faint moisture stain and puckering to the top spine corner throughout, else clean and tight; ix, 93 pages, illustrated (b&w diagrams). FP New Rockville Stock. Seller inventory number 1351585
Entropy, mutual information and divergence measure the randomness, dependence and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among other fields, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several non-parametric algorithms have been proposed to estimate information measures. Universal Estimation of Information Measures for Analog Sources presents a comprehensive survey of universal estimation of information measures for memoryless analog (real-valued or real vector-valued) sources, with an emphasis on the estimation of mutual information and divergence and their applications. The book reviews the consistency of the universal algorithms and the corresponding sufficient conditions, as well as their speed of convergence. Universal Estimation of Information Measures for Analog Sources provides a comprehensive review of an increasingly important topic in information theory. It will be of interest to students, practitioners and researchers working in information theory.
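To make the blurb's subject concrete, below is a minimal Python sketch of one classical non-parametric estimator of the kind the book surveys: the Kozachenko-Leonenko k-nearest-neighbor estimator of differential entropy for an analog (real vector-valued) source. The function name kl_entropy and all parameter choices are illustrative assumptions, not code or notation from the book.

    # A minimal sketch, assuming i.i.d. samples from a continuous source.
    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def kl_entropy(samples: np.ndarray, k: int = 3) -> float:
        """Kozachenko-Leonenko estimate of differential entropy, in nats.

        samples: (n, d) array of n observations of a d-dimensional source.
        k: number of nearest neighbors (trades bias against variance).
        """
        n, d = samples.shape
        tree = cKDTree(samples)
        # Distance from each point to its k-th nearest neighbor,
        # excluding the point itself (index 0 in the sorted distances).
        dists, _ = tree.query(samples, k=k + 1)
        eps = dists[:, k]
        # Log volume of the d-dimensional Euclidean unit ball.
        log_unit_ball = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
        return (digamma(n) - digamma(k) + log_unit_ball
                + d * np.mean(np.log(eps)))

    # Sanity check against a known closed form: for X ~ N(0, 1),
    # H(X) = 0.5 * log(2 * pi * e), roughly 1.4189 nats.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((5000, 1))
    print(kl_entropy(x, k=3))  # typically prints a value near 1.419

This estimator is universal in the sense the blurb describes: it uses only the sample itself, with no assumed parametric form for the underlying density.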
Title: UNIVERSAL ESTIMATION OF INFORMATION MEASURES...
Publisher: NOW the Essence of Knowledge, Hanover
Publication date: 2009
Binding: Softcover
Seller: Phatpocket Limited, Waltham Abbey, HERTS, United Kingdom
Condition: Used, like new. Book is new and unread but may have minor shelf wear. Your purchase helps support the Sri Lankan children's charity 'The Rainbow Centre'. Our donations to The Rainbow Centre have helped provide an education and a safe haven to hundreds of children who live in appalling conditions. Item no. Z1-J-018-01066
Quantity: 1 available