Fundamentals of Information Theory


Information Theory
Instructor: Y. DING
Email:
Tel: 87112490

Reference book: Principles of Digital Communications, Suili FENG et al., Publishing house: PHEI.

About the course, three fundamental resources:
- Material
- Energy
- Information

Chapter 4. Fundamentals of information theory
- Measure of information (entropy)
- Discrete channel and capacity
- Continuous source, channel and capacity
- Source coding
- Rate distortion theory

4.1 Introduction

Message vs. Information
(1) A message can be, but is not limited to: symbols, letters, numbers, speech, images, etc.

(2) A message may contain information, or no information (e.g. an SMS or multimedia message).
(3) Amount of information: the amount of uncertainty reduced by the reception of the message.
(4) The purpose of communication is information transmission.
(5) The milestone of information theory: "A Mathematical Theory of Communication" by Claude Elwood Shannon, 1948.

4.2 Measure of Information (Entropy)

Measure of information
(1) Amount of information = amount of uncertainty reduced by the reception of a message. Uncertainty is tied to likelihood, i.e. to probability, so the amount of information should be a function of probability (but which function?).
(2) Different messages may differ in their amount of information, and a useful measurement should be additive in amount of information; this again points to a function of probability.

Measurement of a discrete source
Description of a statistical discrete source with N possible symbols:

$$\begin{pmatrix} X \\ p(X) \end{pmatrix}: \begin{pmatrix} x_1 & x_2 & \cdots & x_N \\ p(x_1) & p(x_2) & \cdots & p(x_N) \end{pmatrix}, \qquad \sum_{i=1}^{N} p(x_i) = 1$$

(e.g. BPSK, QPSK and 16QAM constellations, with N = 2, 4 and 16.)

The amount of information is a function of probability: $I(x_i) = f(p(x_i))$. If $x_i$ and $x_j$ are statistically independent, $I$ must satisfy the additivity property:

$$I(x_i x_j) = f(p(x_i x_j)) = f(p(x_i)\,p(x_j)) = f(p(x_i)) + f(p(x_j))$$

Defining $f(p) = \log(1/p)$, we have

$$I(x_i) = \log\frac{1}{p(x_i)} = -\log p(x_i)$$

$$I(x_i x_j) = \log\frac{1}{p(x_i)\,p(x_j)} = \log\frac{1}{p(x_i)} + \log\frac{1}{p(x_j)} = I(x_i) + I(x_j)$$

Definition: the amount of information carried by message $x_i$ is

$$I(x_i) = \log\frac{1}{p(x_i)} = -\log p(x_i)$$

Units: a base-2 logarithm gives bits; a natural logarithm gives nats; a base-10 logarithm gives hartleys.
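The logarithm is what turns the product of independent probabilities into a sum, which is exactly the additivity requirement above. A minimal Python sketch of the definition (the helper name self_information is illustrative, not from the reference book):

```python
import math

def self_information(p: float, base: float = 2) -> float:
    """I(x) = -log(p(x)): bits for base 2, nats for base e, hartleys for base 10."""
    if not 0 < p <= 1:
        raise ValueError("p must be a probability in (0, 1]")
    return -math.log(p, base)

# Additivity for independent symbols: I(xi xj) = I(xi) + I(xj)
p_i, p_j = 1/4, 1/8
assert abs(self_information(p_i * p_j)
           - (self_information(p_i) + self_information(p_j))) < 1e-12
print(self_information(0.5))           # 1.0 bit
print(self_information(0.5, math.e))   # about 0.693 nats
```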

Example: source X follows the distribution

$$\begin{pmatrix} X \\ p(X) \end{pmatrix}: \begin{pmatrix} 0 & 1 & 2 & 3 \\ 3/8 & 1/4 & 1/4 & 1/8 \end{pmatrix}$$

The four symbols are statistically independent. Calculate the amount of information contained in the sequence S = "113200".

$$I(S) = \log\frac{1}{p(S)} = 2\log\frac{1}{p(1)} + \log\frac{1}{p(3)} + \log\frac{1}{p(2)} + 2\log\frac{1}{p(0)} = 2\times 2 + 3 + 2 + 2\times 1.415 = 11.83\ \text{bits}$$
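A quick numerical check of this example, assuming the distribution stated above (plain Python, no external libraries):

```python
import math

p = {"0": 3/8, "1": 1/4, "2": 1/4, "3": 1/8}  # source distribution from the example

def sequence_information(seq: str) -> float:
    """I(S) in bits for an i.i.d. sequence: sum of -log2 p(s) over its symbols."""
    return sum(-math.log2(p[s]) for s in seq)

print(round(sequence_information("113200"), 2))  # 11.83 bits, as computed above
```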

(How much information is contained in one QPSK/16QAM symbol?)

Entropy: average amount of information

Definition: for a discrete source $X: x_i,\ i = 1, 2, \dots, N$, the entropy is

$$H(X) = \sum_{i=1}^{N} p(x_i)\log\frac{1}{p(x_i)} = -\sum_{i=1}^{N} p(x_i)\log p(x_i)$$

i.e. the statistical expectation of $I(x_i)$. Physical significance: the average amount of information contained in one symbol.

Example: calculate the entropy of the source X above.

$$H(X) = -\sum_{i=1}^{4} p(x_i)\log p(x_i) = \frac{3}{8}\log\frac{8}{3} + \frac{1}{4}\log 4 + \frac{1}{4}\log 4 + \frac{1}{8}\log 8 = 1.906\ \text{bits/symbol}$$
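The same check for the entropy value, as a sketch (the helper name entropy is ours):

```python
import math

def entropy(probs) -> float:
    """Shannon entropy H(X) = -sum p_i log2 p_i, in bits per symbol."""
    return -sum(pi * math.log2(pi) for pi in probs if pi > 0)

print(round(entropy([3/8, 1/4, 1/4, 1/8]), 3))  # 1.906 bits/symbol
```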

Entropy of a discrete source. Example: assume the symbols of the same source are statistically independent; calculate the information contained in the following sequence, (1) by exact computation and (2) by approximation using entropy.

201 020 130 213 001 203 210 100 321 010 023 102 002 10 312 032 100 120 210

Solution 1: exact computation based on probability. The symbol counts are $n_0 = 23$, $n_1 = 14$, $n_2 = 13$, $n_3 = 7$:

$$I = \sum_{i=1}^{4} n_i \log\frac{1}{p(x_i)} = 23\log\frac{8}{3} + 14\log 4 + 13\log 4 + 7\log 8 = 107.55\ \text{bits}$$

Solution 2: approximation using entropy:

$$I \approx n\,H(X) = (23 + 14 + 13 + 7)\times 1.906 = 108.62\ \text{bits}$$
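Both solutions can be reproduced from the symbol counts given in the example; a short sketch:

```python
import math

p = [3/8, 1/4, 1/4, 1/8]   # p(0), p(1), p(2), p(3)
counts = [23, 14, 13, 7]    # occurrences of 0, 1, 2, 3 in the sequence

exact = sum(n * -math.log2(pi) for n, pi in zip(counts, p))  # Solution 1
H = -sum(pi * math.log2(pi) for pi in p)                     # H(X) of the source
approx = sum(counts) * H                                     # Solution 2: n * H(X)
print(round(exact, 2), round(approx, 2))  # 107.55 108.62
```

The two answers differ because the empirical frequencies of one finite sequence (23/57, 14/57, ...) are not exactly the source probabilities; as the sequence grows, $I/n$ converges to $H(X)$.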

Maximum entropy theorem

Definition (convex set): a set $\mathcal{X} \subseteq \mathbb{R}^n$ is convex if for any $\vec{x}_i = (x_{i1}, x_{i2}, \dots, x_{in})$ and $\vec{x}_j = (x_{j1}, x_{j2}, \dots, x_{jn})$ in $\mathcal{X}$ and any $0 \le \theta \le 1$, we have $\theta\vec{x}_i + (1-\theta)\vec{x}_j \in \mathcal{X}$.

Definition: for $\vec{x}_i, \vec{x}_j \in \mathcal{X}$ and $0 \le \theta \le 1$:
- $f$ is a convex function (∪-type, opening upward) if $f(\theta\vec{x}_i + (1-\theta)\vec{x}_j) \le \theta f(\vec{x}_i) + (1-\theta)f(\vec{x}_j)$;
- $f$ is a concave function (∩-type, opening downward) if $f(\theta\vec{x}_i + (1-\theta)\vec{x}_j) \ge \theta f(\vec{x}_i) + (1-\theta)f(\vec{x}_j)$.

A convex function has a minimum; a concave function has a maximum.

[Figure: example of a concave function, showing the chord value θf(x₁) + (1-θ)f(x₂) lying below the curve value f(θx₁ + (1-θ)x₂).]

If $f$ is a concave function and the probability vector $\vec{p} = (p_1, p_2, \dots, p_N)$ satisfies $\sum_{i=1}^{N} p_i = 1$, then

$$f\left(\sum_{i=1}^{N} p_i x_i\right) \ge \sum_{i=1}^{N} p_i f(x_i)$$

Using this conclusion, we have the following theorem.

Theorem: the entropy $H(X)$ is a concave function of the probability vector $(p(x_1), p(x_2), \dots, p(x_N))$.

Q: $H(X)$ is concave on the set $\{\vec{p}: \sum_{i=1}^{N} p_i = 1\}$; when does $H(X)$ take its maximum value?

Theorem: entropy takes its maximum when X follows the equal-probability distribution,

$$H(p(x_1), p(x_2), \dots, p(x_N)) \le H\left(\frac{1}{N}, \frac{1}{N}, \dots, \frac{1}{N}\right) = \sum_{i=1}^{N}\frac{1}{N}\log N = \log N$$

Namely, the equally distributed source has the greatest uncertainty.