Information Theory

Excellent introduction treats three major areas: analysis of channel models and proofs of coding theorems; the study of specific coding systems; and the study of the statistical properties of information sources. An appendix summarizes the Hilbert space background and results from the theory of stochastic processes. Advanced undergraduate to graduate level. Bibliography.
Contents
I | 1
II | 5
III | 12
IV | 16
V | 21
VI | 24
VII | 27
VIII | 28
IX | 33
X | 35
XI | 36
XII | 40
XIII | 43
XIV | 46
XV | 49
XVI | 53
XVII | 60
XVIII | 63
XIX | 77
XX | 80
XXI | 83
XXII | 87
XXIII | 89
XXIV | 91
XXV | 95
XXVI | 105
XXVII | 110
XXVIII | 113
XXIX | 124
XXX | 126
XXXI | 127
XXXII | 134
XXXIII | 138
XXXIV | 147
XXXV | 156
XXXVI | 161
XXXVII | 163
XXXVIII | 169
XXXIX | 172