Analytic Information Theory
Information theory allows problems of communication and compression to be precisely modeled, formulated, and analyzed, and the information they involve to be transformed by means of algorithms. Learning, too, can be viewed as compression with side information. Aimed at students and researchers, this book addresses data compression, the redundancy of existing methods, and central topics in theoretical data compression, demonstrating how tools from analytic combinatorics can be used to discover and analyze the precise behavior of source codes. It shows that to represent information in its shortest description, one must first understand what that information is, and then algorithmically extract it in its most compact form via an efficient compression algorithm. Part I covers fixed-to-variable codes such as Shannon and Huffman codes, variable-to-fixed codes such as Tunstall and Khodak codes, and variable-to-variable Khodak codes for known sources. Part II discusses universal source coding for memoryless, Markov, and renewal sources.
- Explores tools of analytic combinatorics, including Mellin transforms, saddle point methods, and other methods of complex asymptotics
- Introduces readers to core problems of source coding and teaches them how to derive precise analyses of Shannon and Huffman codes, Tunstall and Khodak codes, and non-prefix codes
- Teaches readers how to apply tools of analytic combinatorics to problems of information theory, connecting information theory with learning theory
- Addresses topics of interest to graduate students and researchers in computer science and electrical engineering working on the design and analysis of data compression methods
Reviews & endorsements
'Drmota & Szpankowski's book presents an exciting and very timely review of the theory of lossless data compression, from one of the modern points of view. Their development draws interesting connections with learning theory, and it is based on a collection of powerful analytical techniques.' Ioannis Kontoyiannis, University of Cambridge
'Drmota and Szpankowski, leading experts in the mathematical analysis of discrete structures, present here a compelling treatment unifying modern and classical results in information theory and analytic combinatorics. This book is certain to be a standard reference for years to come.' Robert Sedgewick, Princeton University
Product details
Hardback (publication date not available)
ISBN: 9781108474443
400 pages
261 × 185 × 27 mm
0.904 kg
Table of Contents
- Part I. Known Sources:
- 1. Preliminaries
- 2. Shannon and Huffman FV codes
- 3. Tunstall and Khodak VF codes
- 4. Divide-and-conquer VF codes
- 5. Khodak VV codes
- 6. Non-prefix one-to-one codes
- 7. Advanced data structures: tree compression
- 8. Graph and structure compression
- Part II. Universal Codes:
- 9. Minimax redundancy and regret
- 10. Redundancy of universal memoryless sources
- 11. Markov types and redundancy for Markov sources
- 12. Non-Markovian sources: redundancy of renewal processes
- A. Probability
- B. Generating functions
- C. Complex asymptotics
- D. Mellin transform and Tauberian theorems
- E. Exponential sums and uniform distribution mod 1
- F. Diophantine approximation
- References
- Index.