Elements of Information Theory

Wiley, Aug 26, 1991 - Computers - 542 pages
Following a brief introduction and overview, the early chapters cover the basic algebraic relationships of entropy, relative entropy, and mutual information; the asymptotic equipartition property (AEP); entropy rates of stochastic processes; data compression; and the duality between data compression and the growth rate of wealth. Later chapters explore Kolmogorov complexity, channel capacity, differential entropy, the capacity of the fundamental Gaussian channel, the relationship between information theory and statistics, rate distortion theory, and network information theory. The final two chapters examine the stock market and inequalities in information theory. In many cases the authors describe the properties of the solutions before presenting the problems.
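The central quantity those early chapters build on, Shannon entropy, can be sketched in a few lines. This is a minimal illustration, not taken from the book; the function name `entropy` and the example distributions are my own:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) * log2 p(x), measured in bits.

    Zero-probability outcomes are skipped, following the convention
    0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable among two-outcome sources: 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is strictly lower.
print(entropy([0.9, 0.1]))
```

The second value is about 0.47 bits, reflecting the compressibility that the data-compression chapters make precise.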


Contents

Entropy, Relative Entropy, and Mutual Information
The Asymptotic Equipartition Property
Entropy Rates of a Stochastic Process

(21 other sections not shown)
