Original Textbooks from Internationally Renowned Universities: Elements of Information Theory

List of Figures
1 Introduction and Preview
1.1 Preview of the book
2 Entropy, Relative Entropy and Mutual Information
2.1 Entropy
2.2 Joint entropy and conditional entropy
2.3 Relative entropy and mutual information
2.4 Relationship between entropy and mutual information
2.5 Chain rules for entropy, relative entropy and mutual information
2.6 Jensen's inequality and its consequences
2.7 The log sum inequality and its applications
2.8 Data processing inequality
2.9 The second law of thermodynamics
2.10 Sufficient statistics
2.11 Fano's inequality
Summary of Chapter 2
Problems for Chapter 2
Historical notes
3 The Asymptotic Equipartition Property
3.1 The AEP
3.2 Consequences of the AEP: data compression
3.3 High probability sets and the typical set
Summary of Chapter 3
Problems for Chapter 3
Historical notes
4 Entropy Rates of a Stochastic Process
4.1 Markov chains
4.2 Entropy rate
4.3 Example: Entropy rate of a random walk on a weighted graph
4.4 Hidden Markov models
Summary of Chapter 4
Problems for Chapter 4
Historical notes
5 Data Compression
5.1 Examples of codes
5.2 Kraft inequality
5.3 Optimal codes
5.4 Bounds on the optimal codelength
5.5 Kraft inequality for uniquely decodable codes
5.6 Huffman codes
5.7 Some comments on Huffman codes
5.8 Optimality of Huffman codes
5.9 Shannon-Fano-Elias coding
5.10 Arithmetic coding
5.11 Competitive optimality of the Shannon code
5.12 Generation of discrete distributions from fair coins
Summary of Chapter 5
Problems for Chapter 5
Historical notes
6 Gambling and Data Compression
6.1 The horse race
6.2 Gambling and side information
6.3 Dependent horse races and entropy rate
6.4 The entropy of English
6.5 Data compression and gambling
6.6 Gambling estimate of the entropy of English
Summary of Chapter 6
Problems for Chapter 6
Historical notes
7 Kolmogorov Complexity
7.1 Models of computation
7.2 Kolmogorov complexity: definitions and examples
7.3 Kolmogorov complexity and entropy
7.4 Kolmogorov complexity of integers
7.5 Algorithmically random and incompressible sequences
7.6 Universal probability
7.7 The halting problem and the non-computability of Kolmogorov complexity
7.8 Ω
7.9 Universal gambling
7.10 Occam's razor
7.11 Kolmogorov complexity and universal probability
7.12 The Kolmogorov sufficient statistic
Summary of Chapter 7
Problems for Chapter 7
Historical notes
8 Channel Capacity
8.1 Examples of channel capacity
8.2 Symmetric channels
8.3 Properties of channel capacity
8.4 Preview of the channel coding theorem
8.5 Definitions
8.6 Jointly typical sequences
8.7 The channel coding theorem
8.8 Zero-error codes
8.9 Fano's inequality and the converse to the coding theorem
8.10 Equality in the converse to the channel coding theorem
8.11 Hamming codes
8.12 Feedback capacity
8.13 The joint source channel coding theorem
Summary of Chapter 8
Problems for Chapter 8
Historical notes
9 Differential Entropy
9.1 Definitions
9.2 The AEP for continuous random variables
9.3 Relation of differential entropy to discrete entropy
9.4 Joint and conditional differential entropy
9.5 Relative entropy and mutual information
9.6 Properties of differential entropy, relative entropy and mutual information
9.7 Differential entropy bound on discrete entropy
Summary of Chapter 9
Problems for Chapter 9
Historical notes
10 The Gaussian Channel
10.1 The Gaussian channel: definitions
10.2 Converse to the coding theorem for Gaussian channels
10.3 Band-limited channels
10.4 Parallel Gaussian channels
10.5 Channels with colored Gaussian noise
10.6 Gaussian channels with feedback
Summary of Chapter 10
Problems for Chapter 10
Historical notes
11 Maximum Entropy and Spectral Estimation
11.1 Maximum entropy distributions
11.2 Examples
11.3 An anomalous maximum entropy problem
11.4 Spectrum estimation
11.5 Entropy rates of a Gaussian process
11.6 Burg's maximum entropy theorem
Summary of Chapter 11
Problems for Chapter 11
Historical notes
12 Information Theory and Statistics
12.1 The method of types
12.2 The law of large numbers
12.3 Universal source coding
12.4 Large deviation theory
12.5 Examples of Sanov's theorem
12.6 The conditional limit theorem
12.7 Hypothesis testing
12.8 Stein's lemma
12.9 Chernoff bound
12.10 Lempel-Ziv coding
12.11 Fisher information and the Cramér-Rao inequality
Summary of Chapter 12
Problems for Chapter 12
Historical notes
13 Rate Distortion Theory
13.1 Quantization
13.2 Definitions
13.3 Calculation of the rate distortion function
13.4 Converse to the rate distortion theorem
13.5 Achievability of the rate distortion function
13.6 Strongly typical sequences and rate distortion
13.7 Characterization of the rate distortion function
13.8 Computation of channel capacity and the rate distortion function
Summary of Chapter 13
Problems for Chapter 13
Historical notes
14 Network Information Theory
14.1 Gaussian multiple user channels
14.2 Jointly typical sequences
14.3 The multiple access channel
14.4 Encoding of correlated sources
14.5 Duality between Slepian-Wolf encoding and multiple access channels
14.6 The broadcast channel
14.7 The relay channel
14.8 Source coding with side information
14.9 Rate distortion with side information
14.10 General multiterminal networks
Summary of Chapter 14
Problems for Chapter 14
Historical notes
15 Information Theory and the Stock Market
15.1 The stock market: some definitions
15.2 Kuhn-Tucker characterization of the log-optimal portfolio
15.3 Asymptotic optimality of the log-optimal portfolio
15.4 Side information and the doubling rate
15.5 Investment in stationary markets
15.6 Competitive optimality of the log-optimal portfolio
15.7 The Shannon-McMillan-Breiman theorem
Summary of Chapter 15
Problems for Chapter 15
Historical notes
16 Inequalities in Information Theory
16.1 Basic inequalities of information theory
16.2 Differential entropy
16.3 Bounds on entropy and relative entropy
16.4 Inequalities for types
16.5 Entropy rates of subsets
16.6 Entropy and Fisher information
16.7 The entropy power inequality and the Brunn-Minkowski inequality
16.8 Inequalities for determinants
16.9 Inequalities for ratios of determinants
Overall Summary
Problems for Chapter 16
Historical notes
Bibliography
List of Symbols
Index
Thomas M. Cover is a professor in the Departments of Electrical Engineering and Statistics at Stanford University. A past president of the IEEE Information Theory Society, he is a Fellow of the Institute of Mathematical Statistics and a senior member of the IEEE. His 1972 paper "Broadcast Channels" won the Information Theory Paper Award, and in 1990 he was named Shannon Lecturer, the highest honor in information theory. For the past twenty years, his work has focused on the relationship between information theory and statistics.
Elements of Information Theory systematically introduces the fundamentals of information theory and their applications in communication theory, statistics, computer science, probability theory, and investment theory. Proceeding step by step, the authors present the basic definitions of information measures, relative entropy, and mutual information, and show how these quantities arise naturally in solving problems of data compression, channel capacity, rate distortion, statistical hypothesis testing, and network information flow. The book also explores many topics not covered in other textbooks. It is suitable as a textbook or reference for advanced undergraduate and graduate students in communications, electronics, computer science, automatic control, statistics, economics, and related disciplines, and as a reference for researchers and practitioners in these fields.