Elements of Information Theory

John Wiley & Sons, Jul 11, 2006 - Computers - 776 pages
The latest edition of this classic is updated with new problem sets and material
The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.
All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.
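As a concrete taste of the book's central quantity, the following minimal Python sketch (illustrative only, not taken from the text; the function name and example distributions are assumptions) computes the Shannon entropy of a discrete distribution in bits:

    import math

    def shannon_entropy(p):
        # Shannon entropy H(X) = -sum_x p(x) log2 p(x), measured in bits.
        # `p` is a sequence of probabilities summing to 1; zero-probability
        # outcomes contribute nothing (0 log 0 is taken as 0).
        return -sum(px * math.log2(px) for px in p if px > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.47 bits

A fair coin is maximally uncertain among two-outcome sources, which is why it attains the full 1 bit; more skewed distributions fall below that.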
The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.

An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.

What people are saying

User ratings

5 stars: 14
4 stars: 3
3 stars: 1
2 stars: 0
1 star: 0

User Review

Read it.

Review: Elements of Information Theory

User Review - Jason Yang - Goodreads

Cover and Thomas is THE classic information theory textbook. Here, the authors took on the ambitious task of making a comprehensive survey of (the still evolving) information theory. Admittedly, I got ...

Contents

1 Introduction and Preview  1
2 Entropy, Relative Entropy, and Mutual Information  13
3 Asymptotic Equipartition Property  57
4 Entropy Rates of a Stochastic Process  71
5 Data Compression  103
6 Gambling and Data Compression  159
7 Channel Capacity  183
8 Differential Entropy  243
9 Gaussian Channel  261
10 Rate Distortion Theory  301
11 Information Theory and Statistics  347
12 Maximum Entropy  409
13 Universal Source Coding  427
14 Kolmogorov Complexity  463
15 Network Information Theory  509
16 Information Theory and Portfolio Theory  613
17 Inequalities in Information Theory  657
Bibliography  689
List of Symbols  723

About the author (2006)

THOMAS M. COVER, PHD, is Professor in the departments of electrical engineering and statistics at Stanford University. A recipient of the 1991 IEEE Claude E. Shannon Award, Dr. Cover is a past president of the IEEE Information Theory Society, a Fellow of the IEEE and the Institute of Mathematical Statistics, and a member of the National Academy of Engineering and the American Academy of Arts and Sciences. He has authored more than 100 technical papers and is coeditor of Open Problems in Communication and Computation.

JOY A. THOMAS, PHD, is the Chief Scientist at Stratify, Inc., a Silicon Valley start-up specializing in organizing unstructured information. After receiving his PhD at Stanford, Dr. Thomas spent more than nine years at the IBM T. J. Watson Research Center in Yorktown Heights, New York. Dr. Thomas is a recipient of the IEEE Charles LeGeyt Fortescue Fellowship.
