Information Theory

About The Book

This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning, equipping students with a uniquely well-rounded and rigorous foundation for further study. It introduces core topics such as data compression, channel coding, and rate-distortion theory using a unique finite-blocklength approach. With over 210 end-of-part exercises and numerous examples, students are introduced to contemporary applications in statistics, machine learning, and modern communication theory. The textbook presents information-theoretic methods with applications in statistical learning and computer science, such as f-divergences, PAC-Bayes and variational principles, Kolmogorov's metric entropy, strong data processing inequalities, and entropic upper bounds for statistical estimation. Accompanied by a solutions manual for instructors and additional standalone chapters on more specialized topics in information theory, this is the ideal introductory textbook for senior undergraduate and graduate students in electrical engineering, statistics, and computer science.