
Editorial Recommendation
《工业和信息化部"十二五"规划教材:信息理论基础(英文版)》 (Fundamentals of Information Theory, English Edition; a Ministry of Industry and Information Technology "Twelfth Five-Year Plan" textbook): This book summarizes the authors' experience in teaching information theory courses in both English and Chinese, together with insights gained from years of scientific research practice. It can be used as a textbook for information theory courses for both international undergraduate and graduate students.
Contents
Chapter 1 Introduction
1.1 Concept of information
1.2 History of information theory
1.3 Information, messages and signals
1.4 Communication system model
1.5 Information theory applications
1.5.1 Electrical engineering (communication theory)
1.5.2 Computer science (algorithmic complexity)
Exercises
Chapter 2 Statistical Measure of Information
2.1 Information of random events
2.1.1 Self-information
2.1.2 Conditional self-information
2.1.3 Mutual information of events
2.2 Information of discrete random variables
2.2.1 Entropy of discrete random variables
2.2.2 Joint entropy
2.2.3 Conditional entropy
2.2.4 Mutual information of discrete random variables
2.3 Relationship between entropy and mutual information
2.4 Mutual information and entropy of continuous random variables
2.4.1 Mutual information of continuous random variables
2.4.2 Entropy of continuous random variables
Exercises
Chapter 3 Discrete Source and Its Entropy Rate
3.1 Mathematical model of source
3.1.1 Discrete source and continuous source
3.1.2 Simple discrete source and its extension
3.1.3 Memoryless source and source with memory
3.2 Discrete memoryless source
3.2.1 Definition
3.2.2 Extension of discrete source
3.3 Discrete stationary source
3.3.1 Definition
3.3.2 Entropy rate of discrete stationary source
3.4 Discrete Markov source
3.4.1 Markov chain
3.4.2 Transition probability
3.4.3 Markov source and its entropy rate
Exercises
Chapter 4 Lossless Source Coding and Data Compression
4.1 Asymptotic equipartition property and typical sequences
4.2 Lossless source coding
4.2.1 Encoder
4.2.2 Block code
4.2.3 Fixed length code
4.2.4 Variable length code
4.3 Data compression
4.3.1 Shannon coding
4.3.2 Huffman coding
4.3.3 Fano coding
Exercises
Chapter 5 Discrete Channel and Its Capacity
5.1 Mathematical model of channel
5.2 Discrete memoryless channel
5.2.1 Mathematical model of discrete memoryless channel
5.2.2 Simple DMC
5.2.3 Extension of discrete memoryless channel
5.3 Channel combination
5.4 Channel capacity
5.4.1 Concept of channel capacity
5.4.2 Channel capacity of several special discrete channels
5.4.3 Channel capacity of symmetric channels
5.4.4 Channel capacity of extended DMC
5.4.5 Channel capacity of independent parallel DMC
5.4.6 Channel capacity of the sum channel
5.4.7 Channel capacity of general discrete channels
Exercises
Chapter 6 Noisy-channel Coding
6.1 Probability of error
6.2 Decoding rules
6.3 Channel coding
6.3.1 Simple repetition code
6.3.2 Linear code
6.4 Noisy-channel coding theorem
Exercises
Chapter 7 Rate Distortion
7.1 Quantization
7.2 Distortion definition
7.2.1 Distortion function
7.2.2 Mean distortion
7.3 Rate distortion function
7.3.1 Fidelity criterion for given channel
7.3.2 Definition of rate distortion function
7.3.3 Property of rate distortion function
7.4 Rate distortion theorem and the converse
7.5 The calculation of rate distortion function
Exercises
Chapter 8 Continuous Source and Its Entropy Rate
8.1 Continuous source
8.2 Entropy of continuous source
8.3 Maximum entropy of continuous source
8.4 Joint entropy, conditional entropy and mutual information for continuous random variables
8.5 Entropy rate of continuous source
8.6 Rate distortion for continuous source
Exercises
Chapter 9 Continuous Channel and Its Capacity
9.1 Capacity of continuous channel
9.1.1 Capacity of discrete-time channel
9.1.2 Capacity of continuous-time channel
9.2 The Gaussian channel
9.3 Band-limited channels
9.4 Coding theorem for continuous channel
Exercises
Chapter 10 Maximum Entropy and Spectrum Estimation
10.1 Maximum entropy probability distribution
10.1.1 Maximum entropy distribution
10.1.2 Examples
10.2 Maximum entropy spectrum estimation
10.2.1 Burg's maximum entropy theorem
10.2.2 Maximum entropy spectrum estimation
Exercises
Chapter 11 Experiments of Information Theory
11.1 Measure of information
11.1.1 Information calculator
11.1.2 Properties of entropy
11.2 Simulation of Markov source
11.3 Performance simulation for source coding
11.3.1 Shannon coding
11.3.2 Huffman coding
11.3.3 Fano coding
11.4 Simulation of BSC
11.5 Simulation of the cascade channel
11.6 Calculation of channel capacity
11.7 Decoding rules
11.8 Performance demonstration of channel coding
References
ISBN | 9787512419728
---|---
Publisher | 北京航空航天大学出版社 (Beihang University Press)
Author | 陈杰 (Chen Jie)
Size | 16