Artificial Intelligence slides: ch14-bayesian-network-part2

Uploader: F****n · Document ID: 88133913 · Uploaded: 2019-04-19 · Format: PPT · 38 pages · 2.04 MB

XIV. Bayesian networks (sections 4-5)
Autumn 2012. Instructor: Wang Xiaolong, Harbin Institute of Technology, Shenzhen Graduate School, Intelligent Computation Research Center (HITSGS ICRC)

Outline
- Exact inference by enumeration
- Exact inference by variable elimination
- Approximate inference by stochastic simulation
- Approximate inference by Markov chain Monte Carlo

Inference tasks
- Simple queries: compute a posterior marginal, P(X_i | E = e)
- Conjunctive queries: joint posteriors over several query variables
- Optimal decisions: decision networks include utility information; probabilistic inference is required to evaluate actions
- Value of information: which evidence to seek next?
- Sensitivity analysis: which probability values are most critical?
- Explanation: why do I need a new starter motor?

Inference by enumeration
A slightly intelligent way to sum out variables from the joint without actually constructing its explicit representation. Simple query on the burglary network: rewrite the full joint entries as products of CPT entries, then sum out the hidden variables. Recursive depth-first enumeration: O(n) space, O(d^n) time.

Enumeration algorithm; evaluation tree
Enumeration is inefficient: the same subexpressions are recomputed in different branches of the evaluation tree.

Inference by variable elimination
Variable elimination carries out the summations right-to-left, storing intermediate results (factors) to avoid recomputation.

Variable elimination: basic operations
Summing out a variable from a product of factors:
- move any constant factors outside the summation
- add up submatrices in the pointwise product of the remaining factors

Variable elimination algorithm; irrelevant variables

Complexity of exact inference
Singly connected networks (or polytrees): any two nodes are connected by at most one (undirected) path. On polytrees, the time and space cost of variable elimination are O(d^k n).

Inference by stochastic simulation
Basic idea:
1) Draw N samples from a sampling distribution S
2) Compute an approximate posterior probability P̂
3) Show this converges to the true probability P

Outline:
- Sampling from an empty network (no evidence)
- Rejection sampling: reject samples that disagree with the evidence
- Likelihood weighting: use the evidence to weight samples
- Markov chain Monte Carlo (MCMC): sample from a stochastic process whose stationary distribution is the true posterior

Sampling from an empty network
(worked example spanning several slides)

Rejection sampling
Similar to a basic real-world empirical estimation procedure.

Analysis of rejection sampling
Rejection sampling returns consistent posterior estimates. Problem: it is hopelessly expensive if P(e) is small, and P(e) drops off exponentially with the number of evidence variables!

Likelihood weighting
Idea: fix the evidence variables, sample only the nonevidence variables, and weight each sample by the likelihood it accords the evidence.
(worked example spanning several slides)

Likelihood weighting analysis
Likelihood weighting returns consistent estimates, but performance
still degrades with many evidence variables, because a few samples carry nearly all the total weight.

Approximate inference using MCMC
- "State" of the network = current assignment to all variables
- Generate the next state by sampling one variable given its Markov blanket
- Sample each variable in turn, keeping the evidence fixed
- Can also choose a variable to sample at random each time

The Markov chain
Wander about for a while, then average what you see. Theorem: the chain approaches its stationary distribution, i.e., the long-run fraction of time spent in each state is exactly proportional to its posterior probability.

Markov blanket sampling

Summary
- Exact inference by variable elimination: polynomial time on polytrees, NP-hard on general graphs; space cost = time cost, and both are very sensitive to topology
- Approximate inference by LW and MCMC:
  - LW does poorly when there is a lot of (downstream) evidence
  - LW and MCMC are generally insensitive to topology
  - Convergence can be very slow with probabilities close to 1 or 0
  - Both can handle arbitrary combinations of discrete and continuous variables

Assignments
Ex 14.3
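To make the exact-inference half of the summary concrete, here is a minimal sketch of inference by enumeration: recursively summing out hidden variables from products of CPT entries. The three-variable chain network and its numbers are illustrative assumptions, not the slides' burglary example.

```python
# Illustrative chain network: Cloudy -> Rain -> WetGrass (assumed numbers).
P_C = 0.5
P_R = {True: 0.8, False: 0.2}
P_W = {True: 0.9, False: 0.1}

ORDER = ['C', 'R', 'W']  # a topological order of the network

def cpt(var, value, ev):
    """P(var = value | parents(var)), parents read from the assignment ev."""
    if var == 'C':
        p = P_C
    elif var == 'R':
        p = P_R[ev['C']]
    else:  # 'W'
        p = P_W[ev['R']]
    return p if value else 1 - p

def enumerate_all(var_list, ev):
    """Recursive depth-first enumeration: O(n) space, exponential time."""
    if not var_list:
        return 1.0
    v, rest = var_list[0], var_list[1:]
    if v in ev:
        return cpt(v, ev[v], ev) * enumerate_all(rest, ev)
    # Hidden variable: sum out both of its values.
    return sum(cpt(v, val, ev) * enumerate_all(rest, {**ev, v: val})
               for val in (True, False))

def enumeration_ask(query, ev):
    """Posterior distribution P(query | ev), normalized over query's values."""
    dist = {val: enumerate_all(ORDER, {**ev, query: val})
            for val in (True, False)}
    z = sum(dist.values())
    return {val: p / z for val, p in dist.items()}

posterior = enumeration_ask('R', {'W': True})
# posterior[True] ≈ 0.9 (exact value 0.45 / 0.50 for these CPTs)
```

Variable elimination improves on this scheme by caching the intermediate factors instead of recomputing them in every branch of the recursion.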

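The likelihood-weighting scheme discussed above can likewise be sketched in a few lines, again on a hypothetical chain Cloudy -> Rain -> WetGrass with evidence WetGrass = true (illustrative numbers): evidence variables are fixed rather than sampled, and each sample is weighted by the likelihood it accords the evidence.

```python
import random

random.seed(1)

# Same illustrative chain network as before (assumed, not from the slides).
P_C = 0.5                           # P(Cloudy = true)
P_R = {True: 0.8, False: 0.2}       # P(Rain = true | Cloudy)
P_W = {True: 0.9, False: 0.1}       # P(WetGrass = true | Rain)

def likelihood_weighting(n=50000):
    """Estimate P(Rain | WetGrass = true): sample nonevidence variables
    top-down and weight each sample by the evidence likelihood."""
    w_rain = w_total = 0.0
    for _ in range(n):
        c = random.random() < P_C       # sample Cloudy from its prior
        r = random.random() < P_R[c]    # sample Rain given the sampled Cloudy
        w = P_W[r]                      # fixed evidence W = true adds P(W=true | r)
        w_total += w
        if r:
            w_rain += w
    return w_rain / w_total

est = likelihood_weighting()
# Converges to 0.45 / 0.50 = 0.9 as n grows
```

Unlike rejection sampling, no sample is discarded; but with many evidence variables most weights become tiny, which is why performance degrades as noted in the summary.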