Information Theory & Coding (信息论与编码, English Edition), 梁建武, Lecture Slides: Chapter 7 Channel Coding

Chapter 7 Channel Coding Theory

7.1 The characteristics of continuous sources
7.2 The channel capacity of continuous sources
7.3 Error control and the fundamentals of channel encoding and decoding
7.4 Linear block codes
7.5 Convolutional codes

7.1 The characteristics of continuous sources

7.1.1 Continuous sources
7.1.2 The entropy of a continuous source
7.1.3 The maximum entropy of a continuous source

7.1.1 Continuous sources

In practice, the output of a source is often a continuous signal, such as a voice signal or a television picture signal. Because the output is both continuous and random, such a source is called a continuous source, and its output can be described by a stochastic process. For a continuous source, the value taken at any specific moment is continuous; that is, the signal is continuous in both time and amplitude.

7.1.2 The entropy of a continuous source

The simplest continuous source can be described by a one-dimensional random variable. A random variable X is continuous if there exists a non-negative function p(x) satisfying

F(x) = \int_{-\infty}^{x} p(t)\,dt.

Then X is said to have a continuous distribution; p(x) is the probability density function and F(x) is the probability distribution function.

A continuous variable satisfies:
(1) 0 \le F(x) \le 1;
(2) F(-\infty) = 0, F(+\infty) = 1;
(3) F(x) is a monotone non-decreasing function;
(4) F(x) is continuous from its left, that is, F(x - 0) = F(x);
(5) P(x_1 \le X < x_2) = F(x_2) - F(x_1).

Definition 7.1.1 For a continuous source with probability density p(x), the entropy of the source is

h(X) = -\int_{-\infty}^{+\infty} p(x)\log p(x)\,dx.
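As a quick illustration of Definition 7.1.1 (an added sketch, not part of the original slides): the fragment below approximates h(X) by numerical integration for two standard densities. The grid bounds, step count, and function names are arbitrary choices of this sketch.

```python
import numpy as np

def differential_entropy(pdf, a, b, n=200_000):
    """Approximate h(X) = -integral p(x) log2 p(x) dx over [a, b] with a Riemann sum."""
    x = np.linspace(a, b, n)
    p = pdf(x)
    # Treat 0 * log(0) as 0 so zero-density points contribute nothing.
    integrand = np.where(p > 0, -p * np.log2(np.where(p > 0, p, 1.0)), 0.0)
    return integrand.sum() * (b - a) / (n - 1)

# Uniform density on [0, 4]: h = log2(4 - 0) = 2 bits.
uniform = lambda x: np.where((x >= 0) & (x <= 4), 0.25, 0.0)
print(differential_entropy(uniform, 0, 4))    # ~2.0

# Standard Gaussian: h = 0.5 * log2(2*pi*e) ~= 2.047 bits.
gauss = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
print(differential_entropy(gauss, -12, 12))   # ~2.047
```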

7.1.3 The maximum entropy of a continuous source

Theorem 7.1.1 Among continuous sources whose output is confined to the interval [a, b], the source with the uniform probability distribution has the maximum output entropy.

Proof: Under the constraint \int_a^b p(x)\,dx = 1, find the density p(x) for which h(X) = -\int_a^b p(x)\ln p(x)\,dx reaches its maximum. Let

J = -\int_a^b p(x)\ln p(x)\,dx + \lambda\Big(\int_a^b p(x)\,dx - 1\Big),

calculate its partial derivative with respect to p(x) and set it to zero:

-\ln p(x) - 1 + \lambda = 0.

After simplification, solve the equation to get p(x) = e^{\lambda - 1}, a constant. Because \int_a^b p(x)\,dx = 1, there is p(x) = \frac{1}{b-a}. Then

h(X) = -\int_a^b \frac{1}{b-a}\log\frac{1}{b-a}\,dx = \log(b-a).
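As a sanity check on Theorem 7.1.1 (an added worked example, not from the slides), compare the uniform density on [0, 2] with the symmetric triangular density on the same interval; the non-uniform density indeed has smaller entropy:

```latex
% Uniform on [0,2]: the maximizer in Theorem 7.1.1.
p_{\mathrm{unif}}(x) = \tfrac{1}{2}
\;\Longrightarrow\;
h = \log_2(2-0) = 1\ \text{bit}.

% Symmetric triangle on [0,2]: a non-uniform density does worse.
p_{\mathrm{tri}}(x) =
\begin{cases}
x,     & 0 \le x \le 1,\\
2 - x, & 1 < x \le 2,
\end{cases}
\;\Longrightarrow\;
h = -2\int_{0}^{1} x\ln x\,dx = \tfrac{1}{2}\ \text{nat}
  \approx 0.72\ \text{bit} < 1\ \text{bit}.
```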

The definition of channel coding: a way of encoding data in a communications channel that adds patterns of redundancy to the transmission in order to lower the error rate. Such methods are widely used in wireless communications.

7.2 The channel capacity of continuous sources

7.2.1 Random coding
7.2.2 Coding theorem

7.2.1 Random coding

For random coding, when the message set M (the code set) of M = q^K codewords is chosen from the q^N points of the N-dimensional vector space, there are \binom{q^N}{M} possible choices; the proportion of the set M among the total points of the vector space is

\frac{M}{q^N} = \frac{q^K}{q^N} = q^{-(N-K)}.

If the set M is chosen from the code candidates at random, the average error probability over the ensemble of codes is

\bar{P}_E = \sum_{\text{codes}} P(\text{code})\,P_E(\text{code}).

Assume a codeword x_m = (x_1, \dots, x_N) of the code set is transmitted and turns into a received word y = (y_1, \dots, y_N) through the DMC channel, with

P(y \mid x_m) = \prod_{n=1}^{N} P(y_n \mid x_n).

Using an indicator function over the decoding regions, Gallager derived an upper bound on the error probability:

\bar{P}_E \le \exp\{-N[E_0(\rho, Q) - \rho R]\}, \qquad 0 \le \rho \le 1,

where Q is the input distribution.

7.2.2 Coding theorem

The average probability of error satisfies

\bar{P}_E \le e^{-N E(R)},

with the exponent E(R) defined below. In this formula the probability of a codeword is the product of the probabilities of its individual symbols, because the channel is memoryless. The upper bound depends only on the channel, not on the method of encoding.
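To make the bound concrete (an added numerical sketch, not from the slides): for a binary symmetric channel with crossover probability p and uniform inputs, E_0(\rho) reduces to -\ln\{2[\tfrac12(1-p)^{1/(1+\rho)} + \tfrac12 p^{1/(1+\rho)}]^{1+\rho}\}. The function names and the example values below are this sketch's assumptions.

```python
import numpy as np

def E0(rho, p):
    """Gallager's E0(rho) for a BSC(p) with uniform inputs, in nats."""
    s = 1.0 / (1.0 + rho)
    inner = 0.5 * ((1 - p) ** s + p ** s)   # same value for y = 0 and y = 1
    return -np.log(2.0 * inner ** (1.0 + rho))

def exponent(R, p, grid=1001):
    """E(R) = max over 0 <= rho <= 1 of [E0(rho) - rho * R], with R in nats/use."""
    rhos = np.linspace(0.0, 1.0, grid)
    return max(E0(r, p) - r * R for r in rhos)

p = 0.05
C = np.log(2) + (1 - p) * np.log(1 - p) + p * np.log(p)   # capacity in nats: ln 2 - H(p)
print(f"capacity C = {C:.4f} nats/use")
for R in (0.2, 0.35, 0.45):                               # rates below C ~= 0.4946
    E = exponent(R, p)
    print(f"R = {R:.2f}: E(R) = {E:.4f}, bound P_e <= exp(-N * {E:.4f})")
```

Note that E(R) stays strictly positive for every R below capacity, which is exactly why the bound can be driven to zero by increasing the block length N.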

Encoding rate:

R = \frac{\ln M}{N} = \frac{K}{N}\ln q \quad \text{(nats per channel symbol)}.

Defined function:

E_0(\rho, Q) = -\ln \sum_{y}\Big[\sum_{x} Q(x)\,P(y \mid x)^{1/(1+\rho)}\Big]^{1+\rho}.

Reliability function:

E(R) = \max_{0 \le \rho \le 1}\ \max_{Q}\ [E_0(\rho, Q) - \rho R].

Conclusion: \bar{P}_E \le e^{-N E(R)}, and E(R) > 0 for every rate R < C, so the average error probability can be made arbitrarily small by increasing the block length N.

[Figures: the relationships among the channel parameters. For a fixed channel, E(R) is positive and decreasing for 0 \le R < C and falls to zero at R = C.]

Noisy channel coding theorem: If the transmission rate R is less than C, then for any \varepsilon > 0 there exists a code with block length n large enough whose error probability is less than \varepsilon.

Converse to the noisy channel coding theorem: If R > C, the probability of an error in a decoded block must approach one, regardless of the code that might be chosen.

These two theorems always appear together and are jointly called the noisy channel coding theorem.
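A minimal numerical illustration of the theorem's dividing line (added; the channel and the code rates are arbitrary example values): for a binary symmetric channel, C = 1 - H_2(p) bits per use, and only rates below C are achievable with vanishing error probability.

```python
import math

def H2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: C = 1 - H2(p) bits/use."""
    return 1.0 - H2(p)

p = 0.11                      # example crossover probability, C ~= 0.5 bit/use
C = bsc_capacity(p)
for k, n in [(1, 3), (4, 7), (11, 15)]:
    R = k / n                 # rate of an (n, k) block code
    verdict = "achievable (R < C)" if R < C else "not achievable (R > C)"
    print(f"(n={n}, k={k}): R = {R:.3f} -> {verdict}")
```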

7.3 Error control and the fundamentals of channel encoding and decoding

7.3.1 The error control method
7.3.2 Code distance, error correction and error detection
7.3.3 Optimal decoding and maximum likelihood decoding

7.3.1 The error control method

(1) Method 1: For the same bit rate, a larger channel capacity gives a larger reliability function E(R); for the same channel capacity, when the rate decreases, the reliability function E(R) increases.

[Figure: illustration of the two ways of increasing E(R).]

The following measures can be taken to reduce the probability of error:

(1) Increase the channel capacity C (a numerical sketch follows):
- extend the bandwidth;
- increase the power;
- reduce the noise.
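To make the first measure concrete (an added sketch, not in the original slides): for a band-limited Gaussian channel, the Shannon-Hartley formula C = B log2(1 + S/(N0·B)) shows how bandwidth, signal power, and noise level each move the capacity. All numbers below are arbitrary example values.

```python
import math

def awgn_capacity(B, S, N0):
    """Shannon-Hartley: C = B * log2(1 + S / (N0 * B)) bits/s for an AWGN channel.
    B: bandwidth (Hz), S: signal power (W), N0: one-sided noise PSD (W/Hz)."""
    return B * math.log2(1 + S / (N0 * B))

base = awgn_capacity(B=3000, S=1e-6, N0=1e-10)
print(f"baseline            : {base / 1e3:.1f} kbit/s")
print(f"double the bandwidth: {awgn_capacity(6000, 1e-6, 1e-10) / 1e3:.1f} kbit/s")
print(f"double the power    : {awgn_capacity(3000, 2e-6, 1e-10) / 1e3:.1f} kbit/s")
print(f"halve the noise     : {awgn_capacity(3000, 1e-6, 5e-11) / 1e3:.1f} kbit/s")
```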

(2) Reduce the bit rate R (a numerical sketch follows):
- q and N are unchanged but k decreases, which means the rate of the information source is reduced and it transmits less information each second;
- q and k are unchanged but N increases, which means the symbol rate (baud rate) increases and the signal occupies more bandwidth;
- k and N are unchanged but q decreases, which means a smaller symbol alphabet is used.
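A small sketch tying the three options together (added; the rate convention R = (k/N)·log2 q bits per channel symbol and all values are this sketch's assumptions):

```python
import math

def rate(k, N, q):
    """Information rate per channel symbol: R = (k / N) * log2(q) bits."""
    return (k / N) * math.log2(q)

print(f"baseline   R = {rate(k=4, N=7,  q=4):.3f} bits/symbol")
print(f"decrease k R = {rate(k=2, N=7,  q=4):.3f}  (q, N fixed)")
print(f"increase N R = {rate(k=4, N=14, q=4):.3f}  (q, k fixed)")
print(f"decrease q R = {rate(k=4, N=7,  q=2):.3f}  (k, N fixed)")
```

With this convention, each of the three moves halves the rate in the example, which is the sense in which all three trade rate for reliability.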
