Unlock secrets of information: from Shannon to machine learning mastery.

Information Theory

From Coding to Learning

$175.14

  • Format: Hardcover, 748 pages

  • Release Date: 2 January 2025


Summary

Mastering Information Theory: From Shannon to Machine Learning

This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning, equipping students with a uniquely well-rounded and rigorous foundation for further study.

It introduces core topics such as data compression, channel coding, and rate-distortion theory using a unique finite block-length approach. With over 210 end-of-p…

Book Details

ISBN-13: 9781108832908
ISBN-10: 1108832903
Authors: Yury Polyanskiy, Yihong Wu
Publisher: Cambridge University Press
Imprint: Cambridge University Press
Format: Hardcover
Number of Pages: 748
Release Date: 2 January 2025
Weight: 1.78 kg
Dimensions: 261 mm x 182 mm x 43 mm

What They're Saying

Critics' Reviews

‘Polyanskiy and Wu’s book treats information theory and various subjects of statistics in a unique ensemble, a striking novelty in the literature. It develops in depth the connections between the two fields, which helps to present the theory in a more complete, elegant and transparent way. An exciting and inspiring read for graduate students and researchers.’
Alexandre Tsybakov, CREST-ENSAE, Paris

‘Since the publication of Claude E. Shannon’s A Mathematical Theory of Communication in 1948, information theory has expanded beyond its original focus on reliable transmission and storage of information to applications in statistics, machine learning, computer science, and beyond. This textbook, written by two leading researchers at the intersection of these fields, offers a modern synthesis of both the classical subject matter and these recent developments. It is bound to become a classic reference.’
Maxim Raginsky, University of Illinois, Urbana-Champaign

‘The central role of information theory in data science and machine learning is highlighted in this book, and will be of interest to all researchers in these areas. The authors are two of the leading young information theorists currently active. Their deep understanding of the area is evident in the technical depth of the treatment, which also covers many communication theory-oriented aspects of information theory.’
Venkat Anantharam, University of California, Berkeley

‘Written in a mathematically rigorous yet accessible style, this book offers information-theoretic tools that are indispensable for high-dimensional statistics. It also presents the classic topic of coding theorems in the modern one-shot (finite block-length) approach. To put it briefly, this is the information theory textbook of the new era.’
Shun Watanabe, Tokyo University of Agriculture and Technology

About The Author

Yury Polyanskiy

Yury Polyanskiy is a Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, with a focus on information theory, statistical machine learning, error-correcting codes, wireless communication, and fault tolerance. He is the recipient of the 2020 IEEE Information Theory Society James Massey Award for outstanding achievement in research and teaching in Information Theory.

Yihong Wu

Yihong Wu is a Professor of Statistics and Data Science at Yale University, focusing on the theoretical and algorithmic aspects of high-dimensional statistics, information theory, and optimization. He is the recipient of the 2018 Sloan Research Fellowship in Mathematics.

Returns

This item is eligible for free returns within 30 days of delivery. See our returns policy for further details.