Decision-Level Identity Fusion

Uploaded by: 精****库 · Document ID: 140667633 · Uploaded: 2020-07-31 · Format: PPTX · Pages: 44 · Size: 505.09KB

Decision-Level Identity Fusion
Tan Xin, Lab 5, System Engineering Dept.

Contents
1. Introduction
2. Classical inference
3. Bayesian inference
4. Dempster-Shafer's method
5. Generalized Evidence Processing (GEP) theory
6. Heuristic methods for identity fusion
7. Implementation and trade-offs

Introduction

Decision-level fusion seeks to process identity declarations from multiple sensors to achieve a joint declaration of identity.

Processing chain: each sensor (A, B, ..., N) performs feature extraction and produces its own identity declaration; the declarations are then associated and combined by decision-level identity fusion.

Decision-level fusion techniques:
- Classical inference
- Bayesian inference
- Dempster-Shafer's method
- Generalized evidence processing theory
- Heuristic methods

Classical inference

Statistical inference techniques seek to draw conclusions about an underlying mechanism or distribution, based on an observed sample of data.

Theoretical base: classical inference typically assumes an empirical probability model. Empirical probability assumes that the observed frequency of an event approximates its probability as the number of trials grows: with k occurrences in n trials,

P(E) ≈ k / n.

One disadvantage: strictly speaking, empirical probabilities are defined only for repeatable events. Classical inference methods rely on empirical probability and hence are not strictly applicable to nonrepeatable events, unless some model can be developed to compute the requisite probabilities.

Classical inference: main technique is hypothesis testing.

Define two hypotheses:
1. A null hypothesis, H0;
2. An alternative hypothesis, H1.

Test logic:
1. Assume that the null hypothesis H0 is true;
2. Examine the consequences of H0 being true for the sampling distribution of the test statistic;
3. Perform the hypothesis test: if the observations have a high probability of occurring when H0 is true, declare that the data do not contradict H0;
4. Otherwise, declare that the data tend to contradict H0.

Two assumptions are required:
1. An exhaustive and mutually exclusive set of hypotheses can be defined;
2. We can compute the probability of an observation, given an assumed hypothesis.

The method can be generalized to include multidimensional data from multiple sensors, but this requires a priori knowledge and computation of multidimensional probability density functions (a serious disadvantage).

Additional disadvantages:
1. Only two hypotheses can be assessed at a time;
2. Complexities arise for multivariate data;
3. The method does not take advantage of a priori likelihood assessments.

Usage: identification of defective parts in manufacturing, and analysis of faults in system diagnosis and maintenance.

Bayesian inference

Bayesian inference updates the likelihood of a hypothesis given a previous likelihood estimate and additional evidence (observations). The technique may be based on either classical or subjective probabilities.

Subjective probabilities lack mathematical rigor and physical interpretation; nevertheless, if used with care, they can be useful in a data fusion inference processor.

Bayesian formulation: suppose H1, H2, ..., Hi, ... represent mutually exclusive and exhaustive hypotheses. Given evidence E, the posterior probability of each hypothesis is

P(Hi | E) = P(E | Hi) P(Hi) / Σj P(E | Hj) P(Hj).
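The Bayesian update over mutually exclusive, exhaustive hypotheses can be sketched as follows (the hypothesis names and probability values are illustrative, not from the slides):

```python
def bayes_update(priors, likelihoods):
    """Update P(Hi) to P(Hi|E) for mutually exclusive, exhaustive hypotheses.

    priors:      {hypothesis: P(Hi)}
    likelihoods: {hypothesis: P(E|Hi)} for the observed evidence E
    """
    evidence = sum(likelihoods[h] * priors[h] for h in priors)  # P(E), the normalizer
    return {h: likelihoods[h] * priors[h] / evidence for h in priors}

# Illustrative: two exhaustive hypotheses about a target's class.
priors = {"friend": 0.7, "foe": 0.3}
likelihoods = {"friend": 0.2, "foe": 0.9}  # P(observed evidence | class)
posterior = bayes_update(priors, likelihoods)
```

Because the hypotheses are exhaustive and mutually exclusive, the posteriors always renormalize to one; here the evidence is strong enough to overturn the prior in favor of "foe".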

Bayesian inference: features
1. It provides a determination of the probability of a hypothesis being true, given the evidence. (Classical inference, by contrast, gives the probability that an observation could be ascribed to an object or event, given an assumed hypothesis.)
2. It allows incorporation of a priori knowledge about the likelihood of a hypothesis being true at all.
3. It can use subjective probabilities both for the a priori probabilities of hypotheses and for the probability of evidence given a hypothesis.

Multisensor fusion: for each sensor, a priori data provide an estimate of the probability that the sensor would declare the object to be of type i given that the object is of type j, denoted P(Di|Oj). These declarations are then combined via a generalization of the Bayesian formulation described before, yielding an updated joint probability for each possible entity Oj.

Inputs to the Bayes formulation:
- P(Di|Oj) for each sensor and entity (or hypothesis Hi);
- P(Oj), the a priori probabilities.

Architecture: each sensor (#1 through #n) maps observables through a classifier to a declaration Di with likelihoods P(Di|Oj); the Bayesian combination formula fuses D1, ..., Dn, and decision logic (MAP, thresholded MAP, etc.) produces the fused identity declaration.

Disadvantages:
1. Difficulty in defining the a priori functions P(Oj);
2. Complexity when there are multiple potential hypotheses and multiple conditionally dependent events;
3. The requirement that competing hypotheses be mutually exclusive: evidence cannot be assigned to both object Oi and object Oj;
4. Lack of an ability to assign general uncertainty.

An IFFN example: an identification-friend-foe-neutral system developed by Ferranti, Inc. of the U.K.
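The multisensor combination and MAP decision logic described above can be sketched as follows, assuming the sensor declarations are conditionally independent given the object type (object types, priors, and likelihood tables are illustrative):

```python
def fuse_declarations(prior, sensor_likelihoods):
    """Combine sensor declarations D1..Dn into the joint posterior P(Oj | D1..Dn).

    prior:              {object_type: P(Oj)}
    sensor_likelihoods: list of {object_type: P(Di|Oj)}, one dict per sensor,
                        assuming the Di are conditionally independent given Oj.
    """
    joint = {}
    for obj, p in prior.items():
        for lik in sensor_likelihoods:
            p *= lik[obj]  # multiply in each sensor's likelihood for this type
        joint[obj] = p
    total = sum(joint.values())  # normalize over the mutually exclusive Oj
    return {obj: p / total for obj, p in joint.items()}

prior = {"fighter": 0.2, "bomber": 0.3, "transport": 0.5}
sensors = [
    {"fighter": 0.6, "bomber": 0.3, "transport": 0.1},  # P(D1|Oj)
    {"fighter": 0.7, "bomber": 0.2, "transport": 0.1},  # P(D2|Oj)
]
posterior = fuse_declarations(prior, sensors)
# MAP decision logic: declare the object type with the highest fused probability.
decision = max(posterior, key=posterior.get)
```

Note that the conditional-independence assumption is what lets the per-sensor likelihoods be multiplied; disadvantage 2 above (conditionally dependent events) is precisely where this simple product form breaks down.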
