Classical and Quantum Information Theory (English Edition)

Contents
Foreword
Introduction
1 Probability basics
1.1 Events, event space, and probabilities
1.2 Combinatorics
1.3 Combined, joint, and conditional probabilities
1.4 Exercises
2 Probability distributions
2.1 Mean and variance
2.2 Exponential, Poisson, and binomial distributions
2.3 Continuous distributions
2.4 Uniform, exponential, and Gaussian (normal) distributions
2.5 Central-limit theorem
2.6 Exercises
3 Measuring information
3.1 Making sense of information
3.2 Measuring information
3.3 Information bits
3.4 Renyi's fake coin
3.5 Exercises
4 Entropy
4.1 From Boltzmann to Shannon
4.2 Entropy in dice
4.3 Language entropy
4.4 Maximum entropy (discrete source)
4.5 Exercises
5 Mutual information and more entropies
5.1 Joint and conditional entropies
5.2 Mutual information
5.3 Relative entropy
5.4 Exercises
6 Differential entropy
6.1 Entropy of continuous sources
6.2 Maximum entropy (continuous source)
6.3 Exercises
7 Algorithmic entropy and Kolmogorov complexity
7.1 Defining algorithmic entropy
7.2 The Turing machine
7.3 Universal Turing machine
7.4 Kolmogorov complexity
7.5 Kolmogorov complexity vs. Shannon's entropy
7.6 Exercises
8 Information coding
8.1 Coding numbers
8.2 Coding language
8.3 The Morse code
8.4 Mean code length and coding efficiency
8.5 Optimizing coding efficiency
8.6 Shannon's source-coding theorem
8.7 Exercises
9 Optimal coding and compression
9.1 Huffman codes
9.2 Data compression
9.3 Block codes
9.4 Exercises
10 Integer, arithmetic, and adaptive coding
10.1 Integer coding
10.2 Arithmetic coding
10.3 Adaptive Huffman coding
10.4 Lempel-Ziv coding
10.5 Exercises
11 Error correction
11.1 Communication channel
11.2 Linear block codes
11.3 Cyclic codes
11.4 Error-correction code types
11.5 Corrected bit-error-rate
11.6 Exercises
12 Channel entropy
12.1 Binary symmetric channel
12.2 Nonbinary and asymmetric discrete channels
12.3 Channel entropy and mutual information
12.4 Symbol error rate
12.5 Exercises
13 Channel capacity and coding theorem
13.1 Channel capacity
13.2 Typical sequences and the typical set
13.3 Shannon's channel coding theorem
13.4 Exercises
14 Gaussian channel and Shannon-Hartley theorem
14.1 Gaussian channel
14.2 Nonlinear channel
14.3 Exercises
15 Reversible computation
15.1 Maxwell's demon and Landauer's principle
15.2 From computer architecture to logic gates
15.3 Reversible logic gates and computation
15.4 Exercises
16 Quantum bits and quantum gates
16.1 Quantum bits
16.2 Basic computations with 1-qubit quantum gates
16.3 Quantum gates with multiple qubit inputs and outputs
16.4 Quantum circuits
16.5 Tensor products
16.6 Noncloning theorem
16.7 Exercises
17 Quantum measurements
17.1 Dirac notation
17.2 Quantum measurements and types
17.3 Quantum measurements on joint states
17.4 Exercises
18 Qubit measurements, superdense coding, and quantum teleportation
18.1 Measuring single qubits
18.2 Measuring n-qubits
18.3 Bell state measurement
18.4 Superdense coding
18.5 Quantum teleportation
18.6 Distributed quantum computing
18.7 Exercises
19 Deutsch-Jozsa, quantum Fourier transform, and Grover quantum database search algorithms
19.1 Deutsch algorithm
19.2 Deutsch-Jozsa algorithm
19.3 Quantum Fourier transform algorithm
19.4 Grover quantum database search algorithm
19.5 Exercises
20 Shor's factorization algorithm
20.1 Phase estimation
20.2 Order finding
20.3 Continued fraction expansion
20.4 From order finding to factorization
20.5 Shor's factorization algorithm
20.6 Factorizing N = 15 and other nontrivial composites
20.7 Public-key cryptography
20.8 Exercises
21 Quantum information theory
21.1 Von Neumann entropy
21.2 Relative, joint, and conditional entropy, and mutual information
21.3 Quantum communication channel and Holevo bound
21.4 Exercises
……
25 Classical and quantum cryptography
About the Book
Classical and Quantum Information Theory (English Edition) gives a complete account of classical and quantum information theory. It begins with the basic concepts and applications of Shannon entropy, then introduces the core features of quantum information and quantum computation. From the perspectives of both classical and quantum information theory, the book covers coding, compression, error correction, encryption, and channel capacity, using an informal yet scientifically precise approach to equip readers to understand quantum gates and circuits.
Throughout, the book presents the key results rather than letting readers get lost in the details of mathematical derivations, and it includes numerous practical examples and end-of-chapter exercises. It is suitable for graduate students and researchers in electronics, communications, computer science, and related fields.