Elements of Information Theory

The latest edition of this classic is updated with new problem sets and material. The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors give readers a solid understanding of the underlying theory and its applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers, and the historical notes that follow each chapter recap the main points.

The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.
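Among the topics the blurb lists, entropy is the book's starting point. As a minimal illustration (not taken from the book itself), Shannon entropy H(X) = -Σ p(x) log₂ p(x) can be computed for a discrete distribution in a few lines; the function name `entropy` here is just illustrative:

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum p_i * log2(p_i), in bits.

    Terms with p_i == 0 are skipped, using the convention 0 log 0 = 0.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit of uncertainty;
# a biased coin carries strictly less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```

The fair coin maximizes entropy over two outcomes, which is the kind of result the early chapters develop formally.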
Review: Elements of Information Theory
User Review - Jason Yang - Goodreads

Cover and Thomas is THE classic information theory textbook. Here, the authors took on the ambitious task of making a comprehensive survey of (the still evolving) information theory. Admittedly, I got ...
Contents
1 Introduction and Preview  1 
2 Entropy, Relative Entropy, and Mutual Information  13 
3 Asymptotic Equipartition Property  57 
4 Entropy Rates of a Stochastic Process  71 
5 Data Compression  103 
6 Gambling and Data Compression  159 
7 Channel Capacity  183 
8 Differential Entropy  243 
12 Maximum Entropy  409 
13 Universal Source Coding  427 
14 Kolmogorov Complexity  463 
15 Network Information Theory  509 
16 Information Theory and Portfolio Theory  613 
17 Inequalities in Information Theory  657 
Bibliography  689 
List of Symbols  723 
Common terms and phrases
achievable algorithm alphabet asymptotic average binary symmetric channel bits broadcast channel calculate capacity region channel capacity codebook codeword lengths coding theorem conditional Consider convex corresponding data compression defined Definition denote density describe differential entropy discrete memoryless encoding entropy rate equal ergodic estimate example feedback Figure Find first Fisher information Gaussian channel given growth rate Hence Huffman code IEEE Trans independent information theory input joint distribution jointly typical Kolmogorov complexity Kraft inequality large numbers Lemma Let X1 log-optimal portfolio lower bound Markov chain matrix maximizing maximum entropy minimization multiple-access channel mutual information node noise optimal code output probability mass function probability of error problem proof prove random variable rate distortion function receiver relative entropy satisfying sender Shannon side information source coding stochastic process string symbols T. M. Cover typical set uniquely decodable vector wealth