Neural Network Courseware: Lecture 5.2 Teaching Notes

Uploaded by: E**** | Document ID: 91093056 | Upload date: 2019-06-21 | Format: PPT | Pages: 29 | Size: 219 KB

5.2 BAM Network (Bidirectional Associative Memory)

The network has two layers: Layer A with n units (x1, x2, …, xn) and Layer B with m units (y1, y2, …, ym), connected through the weight matrices W1 (A to B) and W2 (B to A).

Calculating process:

x(t) = fy(fx(x(t-1) W1) W2)
y(t) = fx(fy(y(t-1) W2) W1)

Assuming x ∈ {-1, +1}^N, y ∈ {-1, +1}^M, with stored pattern pairs (x^i, y^i), i = 1, 2, …, P.

Learning rule. Definition of the energy function:

E = -x W y^T

where wi is the ith row of weight factors in W and wj is the jth column of weight factors in W. When a unit of Layer A changes state, the sign of the change agrees with its net input: if Δxi > 0 then y wi^T > 0, and if Δxi < 0 then y wi^T < 0. Therefore ΔE ≤ 0, so the energy never increases and the network settles into a stable state.

Hebb learning formula: W = Σ_{i=1..P} (x^i)^T y^i, with W1 = W and W2 = W^T (i.e., W1 = W2^T).

Example 1

x1 = (1 0 1 0 1), y1 = (1 1 1 1)
x2 = (1 0 1 0 0), y2 = (0 1 1 0)
x3 = (0 1 0 1 1), y3 = (1 0 0 1)

Change the vectors to {-1, +1}:

x1 = (1 -1 1 -1 1), y1 = (1 1 1 1)
x2 = (1 -1 1 -1 -1), y2 = (-1 1 1 -1)
x3 = (-1 1 -1 1 1), y3 = (1 -1 -1 1)

Recall:

sgn(x1 W) = sgn(1 5 5 1) = y1
sgn(y1 W^T) = sgn(4 4 4 4 4) = (1 1 1 1 1); changed back to binary, sgn(y1 W^T) = (1 0 1 0 1) = x1

With the noise vector Δ = (0 1 0 0 0), x1 + Δ = (1 1 1 0 1), and sgn((x1 + Δ) W) = sgn(2 2 2 2) = y1, so the stored pair is still recovered.

Chapter 6 ANN Hybrid System

6.1 Fuzzy Neural Network
6.2 Evolutionary Computation and Genetic Algorithm
6.3 A Hybrid System of Expert Systems and Neural Networks

6.1 Fuzzy Neural Network

Neural-fuzzy networks implement fuzzy-logic inferencing through neural networks.

1. Fuzzy-logic Systems

Fuzzy logic grew out of a desire to quantify rule-based systems. It provides a way to quantify certain quantifiers such as "approximately", "often", "rarely", "several", "few", and "very".
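The Hebb storage rule and bidirectional recall of Section 5.2 can be sketched as follows. This is a minimal sketch, not the lecture's code: it stores only the first two bipolar pairs of Example 1 (with all three correlated pairs the plain outer-product rule does not recover every pair), and the names `threshold` and `recall` are chosen here for illustration.

```python
import numpy as np

# Bipolar pattern pairs (the first two pairs from Example 1 above).
X = np.array([[1, -1, 1, -1, 1],
              [1, -1, 1, -1, -1]])
Y = np.array([[1, 1, 1, 1],
              [-1, 1, 1, -1]])

# Hebb outer-product learning: W = sum_i x_i^T y_i, with W1 = W and W2 = W^T.
W = sum(np.outer(x, y) for x, y in zip(X, Y))

def threshold(net, prev):
    """sgn with ties broken by keeping the unit's previous state."""
    return np.where(net > 0, 1, np.where(net < 0, -1, prev))

def recall(W, x, steps=10):
    """Bidirectional recall: bounce x -> y -> x until the pair stabilizes."""
    y = threshold(x @ W, np.ones(W.shape[1], dtype=int))
    for _ in range(steps):
        x_new = threshold(y @ W.T, x)
        y_new = threshold(x_new @ W, y)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break
        x, y = x_new, y_new
    return x, y

x_rec, y_rec = recall(W, X[0])
print(y_rec)   # -> [1 1 1 1], i.e. y1
```

Recall also tolerates noise: presenting x1 with one flipped bit still settles on the stored pair (x1, y1), mirroring the noise-vector demonstration above.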

Relationship of fuzzy-logic systems to the two main areas of AI.

A. Fuzzy set

A fuzzy set quantifies the degree of membership with values between 0 (not a member) and 1 (definitely a member). The following figure shows a representation of energy requirement in fuzzy terms.

B. Conversion Between Numeric and Fuzzy-Logic Variables

A numeric variable is converted to a fuzzy-logic variable through a fuzzifier, and the fuzzy-logic variable is converted back to a numeric variable through a defuzzifier. The numeric variables are denoted numeric(x) and the fuzzy-logic variables are denoted fuzzy(…) (e.g., 50 to 70 for "low", and 80 to 100 for "moderate"):

numeric(60) = fuzzy(0, 1, 0, 0, 0)
numeric(90) = fuzzy(0, 0, 1, 0, 0)

For numeric values in the transition regions, we use a linear interpolation between the beginning and ending values of the region. Similarly, we can convert the fuzzy-logic variable back to a numeric variable using the exact opposite process (x1 = 77, x2 = 77).

C. Union and Intersection of Fuzzy Sets

Define two fuzzy sets:

I = {i1/x1, i2/x2, …, in/xn}
J = {j1/x1, j2/x2, …, jp/xp}

where x1, x2, … are members of the set with degrees of membership i1, i2, … (for set I) and j1, j2, … (for set J).
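The fuzzifier/defuzzifier round trip of part B can be sketched as below. Only the "low" (50 to 70) and "moderate" (80 to 100) ranges appear in the text, so this sketch models just those two labels and treats 70 to 80 as their transition region; the function names and the plateau handling are assumptions made here.

```python
# Assumed ranges: only "low" = [50, 70] and "moderate" = [80, 100] are given
# in the notes; [70, 80] is treated as the linear transition between them.
LOW = (50.0, 70.0)
MOD = (80.0, 100.0)

def fuzzify(x):
    """Return (degree_low, degree_moderate) for a numeric value x."""
    if LOW[0] <= x <= LOW[1]:
        return (1.0, 0.0)                  # fully "low", e.g. numeric(60)
    if MOD[0] <= x <= MOD[1]:
        return (0.0, 1.0)                  # fully "moderate", e.g. numeric(90)
    if LOW[1] < x < MOD[0]:                # transition: linear interpolation
        t = (x - LOW[1]) / (MOD[0] - LOW[1])
        return (1.0 - t, t)
    raise ValueError("x outside the modeled range")

def defuzzify(deg_low, deg_mod):
    """Exact inverse of fuzzify inside the transition region."""
    if deg_low == 1.0:
        return sum(LOW) / 2                # representative value of the plateau
    if deg_mod == 1.0:
        return sum(MOD) / 2
    return LOW[1] + deg_mod * (MOD[0] - LOW[1])

deg = fuzzify(77)                 # roughly (0.3, 0.7): 77 is 70% into the transition
print(round(defuzzify(*deg), 6))  # -> 77.0
```

Running the value 77 through fuzzify and back through defuzzify recovers 77 exactly, which is the "exact opposite process" the notes describe.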

Union of two fuzzy sets: I ∪ J = {max(i1, j1)/x1, max(i2, j2)/x2, …}
Intersection of two fuzzy sets: I ∩ J = {min(i1, j1)/x1, min(i2, j2)/x2, …}

D. Fuzzy Control

Fuzzy-logic variables (e, u); control rule table:

R = (NBe × PBu) + (NSe × PSu) + (0e × 0u) + (PSe × NSu) + (PBe × NBu)
u = e ∘ R

2. Neural-Fuzzy Networks

A. Neural-Fuzzy Networks for Expert Systems
B. The architecture of the neural-fuzzy controller

3. Fuzzy Neural Network

Hidden layer 1: represents radially symmetric (Gaussian) membership functions of Ai (i = 1 to N) and Bj (j = 1 to M) for the normalized input variables x1 and x2, respectively. The two parameters of each node are the center and the standard deviation of a Gaussian membership function. (A common representation of the Gaussian membership function is stored in hidden layer 1 of a fuzzy network.)

Hidden layer 2: takes the dot product of the outputs of the membership functions in hidden layer 1, O1_Ai and O1_Bj. For example, the output of the first node (R11) in hidden layer 2 reflects whether the rule "A1 (x1 is negative large) AND B1 (x2 is negative large)" should be applied.
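Hidden layers 1 and 2 above can be sketched as follows. The three-label grid, centers, and width are assumptions for illustration (the lecture's N, M, and parameter values are not listed), and the slide's "dot product" pairing is rendered here as the pairwise product O1_Ai * O1_Bj.

```python
import numpy as np

def gaussian(x, c, sigma):
    """Radially symmetric (Gaussian) membership value with center c, width sigma."""
    return np.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

# Assumed: three labels per input (negative / zero / positive large) on a
# normalized axis; the lecture's actual centers and widths are not given.
centers = np.array([-1.0, 0.0, 1.0])
sigma = 0.5

def layer1(x):
    """Hidden layer 1: membership degrees of one normalized input variable."""
    return gaussian(x, centers, sigma)

def layer2(x1, x2):
    """Hidden layer 2: rule node R_ij fires with strength O1_Ai * O1_Bj."""
    return np.outer(layer1(x1), layer1(x2))

# Both inputs near "negative large": the R11-style node dominates the grid.
R = layer2(-0.9, -1.1)
```

With both inputs near the "negative large" center, R[0, 0] (the analogue of rule node R11) has the largest firing strength, which is exactly the gating behavior the notes describe.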

Hidden layer 3: operates the same as a standard backpropagation network, having weight factors (wij) and a sigmoid transfer function. The weight factors are adjusted to map the firing strength of the fuzzy rules onto the desired output variable.

Hidden layer 4: operates the same as a standard backpropagation network, as in layer 3.

6.2 Evolutionary Computation and Genetic Algorithm

GA basic operation:

Step 1: Initiation
Step 2: Selection (reproduction), e.g., using a fitness function
Step 3: Crossover

Example 1: two parent strings A|B and C|D exchange segments, producing A|D and C|B.

Example 2 (crossover through a shield/mask; positions where the shield is 1 are exchanged):

A: 0 0 1 1 1 1
B: 1 1 1 1 0 0
Shield: 0 1 0 1 0 1

A': 0 1 1 1 1 0
B': 1 0 1 1 0 1

Step 4: Mutation (0 → 1, 1 → 0)

Practice: maximize f(x) = x², x = 0, 1, …, 31.

1. Initiation (binary code).
2. Selection (reproduction) of the first generation (copying); for example, the string with f = 169 has fitness 169/293 = 0.58 relative to the population average, and selection probability 169/1170 = 0.14.
3. Crossover. After reproduction and crossover the population statistics improve: average fitness 293 → 439, maximum fitness 576 → 729.
4. Mutation. With mutation probability 0.001 and 4 × 5 = 20 bits in the population, the expected number of flipped bits is 20 × 0.001 = 0.02, so no bit changes in this example. This completes one generation.

Application in ANN

A genetic algorithm can be used to learn the neural-network weights wij.

1. Binary code. A mapping relates the value represented by the weight string to the actual numeric value; for example, a weight w between 0 and 1 can be encoded in 4 binary bits. The learning process (crossover and mutation) then produces the next generation of networks.

2. Real number code.

* code: (0.4, -0.3, 2.1, 1.3, 0.9, -0.6, 4.5, -0.1, 0.7)
* estimate function: f = 1 / Σ e_i²
* Initiation: determine the weight factors at random, e.g. two parents

(0.4, -0.3, 2.1, 1.3, 0.9, -0.6, 4.5, -0.1, 0.7)
(0.7, -0.9, 1.2, 0.8, 1.4, 0.1, -1.1, 0.2, -1.1)

* crossover: (0.4, -0.9, 1.2, 1.3, 1.4, 0.1, 4.5, 0.2, -1.1)
* mutation: (0.4, -0.3, 2.1, 1.3, 0.9, -0.6, 4.5, -0.1, 0.7) becomes (0.4, -0.3, 1.0, 1.3, 0.9, -0.6, 4.5, -0.8, 0.7) by adding the stochastic values -1.1 (to the third weight) and -0.7 (to the eighth weight).

6.3 A Hybrid System of Expert Systems and Neural Networks
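The practice example in 6.2 (maximize f(x) = x² for x in [0, 31]) can be sketched end to end. The initial population (13, 24, 8, 19) is inferred from the numbers above (169/1170 and the average of 293 imply Σf = 1170 over four 5-bit strings); selection, one-point crossover, and bit-flip mutation follow the four steps, with the helper names chosen here for illustration.

```python
import random

def fitness(x):                 # f(x) = x^2
    return x * x

def decode(bits):               # 5-bit binary string -> integer in [0, 31]
    return int(bits, 2)

def select(pop):
    """Step 2: fitness-proportional selection, e.g. P(pick 13) = 169/1170."""
    weights = [fitness(decode(b)) for b in pop]
    return random.choices(pop, weights=weights, k=len(pop))

def crossover(a, b, point):
    """Step 3: one-point crossover -- the strings exchange their tails."""
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(bits, pm=0.001):
    """Step 4: flip each bit with probability pm (expected 20 * 0.001 = 0.02
    flips for this 4-string, 5-bit population, i.e. usually none)."""
    return "".join(b if random.random() > pm else "10"[int(b)] for b in bits)

random.seed(1)                                  # reproducible demo run
pop = ["01101", "11000", "01000", "10011"]      # 13, 24, 8, 19
total = sum(fitness(decode(b)) for b in pop)
print(total, total / len(pop))                  # -> 1170 292.5 (the "293" above)

parents = select(pop)
kids = crossover(parents[0], parents[1], 3) + crossover(parents[2], parents[3], 2)
next_gen = [mutate(k) for k in kids]            # one full generation
```

With pm = 0.001 the mutation step almost never changes a bit for a 20-bit population, matching the worked calculation in the notes.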
