Information Theory & Coding (信息论与编码, English Edition) - 梁建武 (Liang Jianwu) - Lecture Slides, Chapter 2: Measures of Information

Chapter 2. Basic Concepts of Information Theory: the Statistical Measure of Information

Introduction (preparation knowledge)

1. Information measure
Information is measurable, and this is the foundation on which information theory is built. There are several approaches to measuring information: the structural measure, the statistical measure, the semantic measure, the fuzzy measure, and so on.

The statistical measure uses the logarithm of the probability of an event to describe its uncertainty and to obtain the information content of a message; it also establishes a new concept, entropy. Entropy is the most important concept in Shannon's information theory.

2. Single-symbol discrete source mathematical model
Since such a discrete source involves only one random event, it can be described by a discrete random variable. In the single-symbol discrete model, the capital letters X, Y, Z are random variables and refer to the source as a whole, while a lower-case symbol such as xi denotes one particular outcome of the random event, i.e., one element of the source. The two must not be confused.
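
As a minimal sketch (not from the slides), such a source can be written down in Python as a table of symbols and probabilities; the symbol names and the numbers are illustrative only.

    # A single-symbol discrete source: the random variable X is specified by its
    # alphabet and by the probability of each symbol (illustrative values).
    source_X = {"a1": 0.5, "a2": 0.25, "a3": 0.125, "a4": 0.125}

    # The probabilities of a complete probability space must sum to 1.
    assert abs(sum(source_X.values()) - 1.0) < 1e-12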

3. Review

4. Mathematics knowledge
log(xy) = log x + log y;  log(x/y) = log x - log y.

2.1 Self-Information and Conditional Self-Information

2.1.1 Self-Information
The information I(ai) of a symbol ai must be a function of ai's uncertainty, i.e., of P(ai); it can therefore be expressed as I(ai) = f[P(ai)].

How about taking I(ai) = P(ai)? This does not suit the 4th axiom below. The requirements on an information measure are:
if P(ai) increases, I(ai) decreases;
if P(ai) = 0, I(ai) = ∞;
if P(ai) = 1, I(ai) = 0;
if a1 and a2 are independent, then I(a1a2) = I(a1) + I(a2).
Regarding the message U as a single random variable: if some message ai appears, its probability being P(ai), then the information made available is the self-information I(ai) = -log P(ai) (the symbol I denotes self-information).
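
As a small illustration (not part of the original slides), the definition can be evaluated directly; base-2 logarithms are used, so the unit is the bit, and the probability values are made up for the example.

    from math import log2

    def self_info(p):
        """Self-information I(a) = -log2 P(a), in bits; p must lie in (0, 1]."""
        return -log2(p)

    print(self_info(0.5))    # 1.0 bit
    print(self_info(0.125))  # 3.0 bits: rarer events carry more information
    print(self_info(1.0))    # 0.0 bits: a certain event carries no information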

Explanations: one is inevitably surprised when a small-probability event occurs, so the information it produces is rich; if an almost impossible event does occur, it is explosive news, amazing the world with a single brilliant feat. An event that is very likely to happen lies within people's expectation; even if it occurs, it carries hardly any information. In particular, when an inevitable event occurs, it gives us no information at all.

Nature of the self-information I(ai):
I(ai) is a non-negative value;
when P(ai) = 1, I(ai) = 0;
when P(ai) = 0, I(ai) = ∞;
I(ai) is a monotone decreasing function of P(ai).
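
A quick numerical check of these properties, reusing the self_info helper sketched above (an illustration, not from the slides):

    probs = [0.9, 0.5, 0.1, 0.01]                          # from likely to unlikely
    infos = [self_info(p) for p in probs]

    assert all(i >= 0 for i in infos)                      # non-negative
    assert self_info(1.0) == 0.0                           # P(ai) = 1 gives 0 bits
    assert all(a < b for a, b in zip(infos, infos[1:]))    # smaller P(ai), larger I(ai)
    # P(ai) = 0 is the limiting case: I(ai) grows without bound (log2(0) is undefined).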

Joint (union) self-information
For a source model that involves two random events, the joint self-information of the pair (ai, bj) is I(aibj) = -log P(aibj). (Example 2.12(6))

2.1.2 Conditional Self-Information
Conditional self-information is the negative logarithm of a conditional probability: it is the information quantity brought by a random event occurring under a specific condition (one that has already been decided). Definition: I(ai|bj) = -log P(ai|bj).

Joint self-information and conditional self-information are likewise non-negative and monotone decreasing. Relationship: I(aibj) = I(ai) + I(bj|ai) = I(bj) + I(ai|bj); when X and Y are independent, I(aibj) = I(ai) + I(bj).
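
The sketch below illustrates these relations on an assumed joint distribution P(ai, bj); the four probability values are invented for the example.

    from math import log2, isclose

    # Assumed joint distribution P(a, b) over two binary events (illustrative numbers).
    P_ab = {("a1", "b1"): 0.40, ("a1", "b2"): 0.10,
            ("a2", "b1"): 0.20, ("a2", "b2"): 0.30}
    P_a = {"a1": 0.50, "a2": 0.50}                                 # marginal P(a)
    P_b_given_a = {ab: p / P_a[ab[0]] for ab, p in P_ab.items()}   # conditional P(b|a)

    def I(p):
        return -log2(p)   # self-information in bits

    # Joint self-info = self-info of a plus conditional self-info of b given a.
    for ab, p in P_ab.items():
        assert isclose(I(p), I(P_a[ab[0]]) + I(P_b_given_a[ab]))
    # If X and Y were independent, P(b|a) = P(b) and this reduces to I(ab) = I(a) + I(b).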

2.2 Mutual Information and Conditional Mutual Information
The probability of the message sent by the source is called the pre-test (a priori) probability. The probability inferred by the receiver after the message has been received is called the after-test (a posteriori) probability. The mutual information is defined as the logarithm of the ratio of the after-test probability to the pre-test probability, that is: I(xi; yj) = log [P(xi|yj) / P(xi)].

Mutual information equals self-information minus conditional self-information: I(xi; yj) = I(xi) - I(xi|yj). The third kind of expression is I(xi; yj) = I(xi) + I(yj) - I(xiyj).

Nature of mutual information:
symmetry, I(xi; yj) = I(yj; xi);
when X and Y are independent, the mutual information is 0;
the mutual information may be a positive or a negative value.

Conditional mutual information is the mutual information of xi and yj under a given condition zk. The definition is: I(xi; yj | zk) = log [P(xi | yjzk) / P(xi | zk)].
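
The following sketch checks these expressions on an assumed joint distribution P(xi, yj); all of the numbers are illustrative.

    from math import log2, isclose

    # Assumed joint distribution P(x, y) and its marginals (illustrative numbers).
    P_xy = {("x1", "y1"): 0.40, ("x1", "y2"): 0.10,
            ("x2", "y1"): 0.20, ("x2", "y2"): 0.30}
    P_x = {"x1": 0.50, "x2": 0.50}
    P_y = {"y1": 0.60, "y2": 0.40}

    def mutual_info(x, y):
        """I(x;y) = log2[P(x|y)/P(x)] = log2[P(x,y)/(P(x)P(y))], symmetric in x and y."""
        return log2(P_xy[(x, y)] / (P_x[x] * P_y[y]))

    for (x, y), p in P_xy.items():
        i = mutual_info(x, y)
        assert isclose(i, -log2(P_x[x]) + log2(p / P_y[y]))        # I(x) - I(x|y)
        assert isclose(i, -log2(P_x[x]) - log2(P_y[y]) + log2(p))  # I(x) + I(y) - I(xy)

    print(mutual_info("x1", "y2"))   # -1.0 bit: mutual information can be negative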

Question and Ponder
Question: a source gives the weather of February in one area, each kind of weather having a pre-test probability. Now somebody tells you that today is not sunny, and you take this as the message y1. The probability of each kind of weather then becomes an after-test probability. Calculate the mutual information between y1 and each kind of weather.
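
The slide's probability table is not reproduced in this extraction; as an illustration, assume the familiar four-weather distribution P(x1 sunny, x2 cloudy, x3 rain, x4 snow) = (1/2, 1/4, 1/8, 1/8). Under that assumption the calculation below reproduces the conclusion stated next.

    from math import log2

    # Assumed pre-test probabilities (x1 = sunny, x2 = cloudy, x3 = rain, x4 = snow).
    # These numbers are an assumption for illustration, not the slide's actual table.
    prior = {"x1": 0.5, "x2": 0.25, "x3": 0.125, "x4": 0.125}

    # Message y1 = "today is not sunny": the after-test probabilities are the
    # remaining priors renormalised over the non-sunny weathers.
    rest = {x: p for x, p in prior.items() if x != "x1"}
    posterior = {x: p / sum(rest.values()) for x, p in rest.items()}

    for x in posterior:
        print(x, log2(posterior[x] / prior[x]))   # I(xi; y1) = 1.0 bit for x2, x3, x4
    # For x1 the after-test probability drops to 0, so I(x1; y1) = log2(0 / 0.5) is
    # negative (it diverges towards minus infinity), in line with the note below.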

Information quantity: x2, x3 and x4 each receive 1 bit of information; equivalently, y1 reduces the uncertainty of x2, x3 and x4 by 1 bit each. This shows that, after y1 is received, the uncertainty of x1 is not reduced but increased, so its mutual information is negative.

Example 2.2
Review of Probability

2.3 Source Entropy

2.3.1 Introduction of entropy
A random variable X may take N possible values, each with a different probability. What information theory is concerned with is the uncertainty of X: the greater its uncertainty, the more information can be gained.

Analysis of the uncertainty of drawing a red ball from a box of 100 balls, for three random variables X, Y and Z:
X: 99 red, 1 black (uncertainty is low);
Y: 50 red, 50 black (uncertainty is higher);
Z: 20 red and 20 of each of four other colors (uncertainty is highest).
Question: is this uncertainty measurable? How can it be measured?

2.3.2 Mathematical description of source entropy
Source entropy, Def.: the mathematical expectation of the self-information of each discrete message from the source (namely, the probability-weighted average value). It is the entropy of the source, called information entropy, also called source entropy or the Shannon entropy; sometimes it is also called the unconditional entropy or the entropy function. The abbreviation is simply entropy.

Formula: H(X) = E[I(ai)] = -Σ P(ai) log P(ai), where the sum runs over all source symbols ai.
The variable in the entropy function is X, which refers to the source as a whole; the entropy is in fact a measure of the source's uncertainty, and it is also the entropy after the experiment. Unit: with 2 as the base of the logarithm, bit/symbol. Why use the word "
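
A sketch of this formula (base-2 logarithms, unit bit/symbol), applied to the three boxes of balls from section 2.3.1; it confirms the low-to-high ordering of uncertainty described there.

    from math import log2

    def entropy(probs):
        """H(X) = -sum of P(ai) * log2 P(ai), in bit/symbol; zero-probability terms contribute 0."""
        return -sum(p * log2(p) for p in probs if p > 0)

    X = [0.99, 0.01]    # 99 red, 1 black
    Y = [0.50, 0.50]    # 50 red, 50 black
    Z = [0.20] * 5      # 20 red and 20 of each of four other colors

    print(entropy(X))   # about 0.08 bit/symbol: low uncertainty
    print(entropy(Y))   # 1.0 bit/symbol
    print(entropy(Z))   # about 2.32 bit/symbol: the highest uncertainty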
