The Theory of Information and Coding
This is a revised edition of McEliece's classic text. It is a self-contained introduction to all the basic results in the theory of information and coding, founded by Claude Shannon in 1948. The theory was developed to address the fundamental problem of communication: reproducing at one point, either exactly or approximately, a message selected at another point. A short, elementary overview introduces the reader to the concept of coding. The main results, the channel and source coding theorems, are then followed by a study of specific coding schemes that can be used for channel and source coding. The volume is suitable for self-study or for a graduate or undergraduate course. It includes dozens of worked examples and several hundred problems for solution. The exposition assumes only some prior knowledge of probability and linear algebra.
- New printing of now classic text
- Many worked examples
- Hundreds of exercises
Reviews & endorsements
'… [An] outstanding book …' Albert A. Mullin, Zentralblatt MATH
Product details
April 2002, Hardback
ISBN: 9780521000956
410 pages
Dimensions: 236 × 158 × 27 mm
Weight: 0.684 kg
108 b/w illustrations, 15 tables
Availability: Available
Table of Contents
- 1. Entropy and mutual information
- 2. Discrete memoryless channels and their capacity-cost functions
- 3. Discrete memoryless sources and their rate-distortion functions
- 4. The Gaussian channel and source
- 5. The source-channel coding theorem
- 6. Survey of advanced topics for part I
- 7. Linear codes
- 8. BCH, Goppa, and related codes
- 9. Convolutional codes
- 10. Variable-length source coding
- 11. Survey of advanced topics for part II.