Language: English
Publisher: Cambridge University Press, 2025
ISBN 10: 1108832903 ISBN 13: 9781108832908
Seller: Books From California, Simi Valley, CA, USA
Hardcover. Condition: Good.
Language: English
Publisher: Cambridge University Press, 2025
ISBN 10: 1108832903 ISBN 13: 9781108832908
Seller: Books From California, Simi Valley, CA, USA
Hardcover. Condition: Fine.
Language: English
Publisher: Cambridge University Press, 2025
ISBN 10: 1108832903 ISBN 13: 9781108832908
Seller: Books From California, Simi Valley, CA, USA
Hardcover. Condition: Very Good.
Language: English
Publisher: Cambridge University Press, 2025
ISBN 10: 1108832903 ISBN 13: 9781108832908
Seller: Ria Christie Collections, Uxbridge, United Kingdom
EUR 90.08
Quantity: More than 20 available
Condition: New.
Language: English
Publisher: Cambridge University Press, 2025
ISBN 10: 1108832903 ISBN 13: 9781108832908
Seller: Kennys Bookstore, Olney, MD, USA
Condition: New. 2024. Hardcover. Books ship from the US and Ireland.
Book. Condition: New. Information Theory | From Coding to Learning | Yury Polyanskiy (et al.) | Book | English | 2025 | Cambridge University Pr. | EAN 9781108832908 | Responsible person for the EU: Libri GmbH, Europaallee 1, 36244 Bad Hersfeld, gpsr[at]libri[dot]de | Seller: preigu.
Language: English
Publisher: Cambridge University Press, 2025
ISBN 10: 1108832903 ISBN 13: 9781108832908
Seller: Revaluation Books, Exeter, United Kingdom
EUR 143.85
Quantity: 2 available
Hardcover. Condition: Brand New. 550 pages. 10.00 x 1.60 x 9.80 inches. In Stock.
Language: English
Publisher: Cambridge University Pr. Jan 2025, 2025
ISBN 10: 1108832903 ISBN 13: 9781108832908
Seller: AHA-BUCH GmbH, Einbeck, Germany
Book. Condition: New. New stock - This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning, equipping students with a uniquely well-rounded and rigorous foundation for further study. It introduces core topics such as data compression, channel coding, and rate-distortion theory using a unique finite-block-length approach. With over 210 end-of-part exercises and numerous examples, students are introduced to contemporary applications in statistics, machine learning, and modern communication theory. This textbook presents information-theoretic methods with applications in statistical learning and computer science, such as f-divergences, PAC-Bayes and the variational principle, Kolmogorov's metric entropy, strong data processing inequalities, and entropic upper bounds for statistical estimation. Accompanied by a solutions manual for instructors, and additional standalone chapters on more specialized topics in information theory, this is the ideal introductory textbook for senior undergraduate and graduate students in electrical engineering, statistics, and computer science.