BP Neural Network Source Code (Basic)

Uploaded by: xiao****1972 | Document ID: 84136226 | Upload date: 2019-03-02 | Format: DOC | Pages: 12 | Size: 45.50 KB

The routine below (`Command5_Click`) trains a three-layer BP network with sigmoid activations: a forward pass through input, hidden, and output layers, delta calculation by backpropagation, and weight updates with a momentum term, followed by listing the learned weights and saving them to disk through VSFlexGrid controls.

Private Sub Command5_Click()
    ' NOTE: comparison operators and string quotes lost in the upload have been
    ' restored by context, and the listing is truncated near the end.
    Dim tempNum As Double
    FactTimes = 0
    For loopt = 0 To times - 1              ' number of learning iterations
        FactTimes = FactTimes + 1
        tempNum = 0
        If FactTimes > 2000 And FactTimes < 5000 Then
            alpha = 0.4
            beta = 0.4
        Else
            alpha = 0.3
            beta = 0.3
        End If
        ' ---- beginning of neural computation ----
        For loopl = 1 To LearnExampleNums   ' number of training patterns
            ' forward computation: input layer
            For i = 1 To InputUnitNums
                OutofInputLayer(i) = inDatas(loopl, i)
            Next i
            ' hidden layer
            For i = 1 To HideUnitNums
                inival = CDbl(0)
                For j = 1 To InputUnitNums
                    inival = inival + w_InputHide(i, j) * OutofInputLayer(j)
                Next j
                inival = inival + Cw_Hide(i)
                OutofHideLayer(i) = Sigmf(inival)
            Next i
            ' output layer
            For i = 1 To OutUnitNums
                inival = 0#
                For j = 1 To HideUnitNums
                    inival = inival + w_HideOut(i, j) * OutofHideLayer(j)
                Next j
                inival = inival + Cw_Out(i)
                OutofOutLayer(i) = Sigmf(inival)
            Next i
            ' ---- backpropagation ----
            ' delta calculation for the output layer
            Error = 0#
            For i = 1 To OutUnitNums
                wk = OutofOutLayer(i)
                wkb = Teacher(loopl, i) - wk
                ' sum of squared errors over the output nodes of this pattern
                Error = Error + wkb * wkb
                DEL_Out(i) = wkb * wk * (1# - wk)
            Next i
            ' delta calculation for the hidden layer
            For i = 1 To HideUnitNums
                inival = 0#
                For j = 1 To OutUnitNums
                    inival = inival + DEL_Out(j) * w_HideOut(j, i)
                Next j
                wk = OutofHideLayer(i)
                DEL_Hide(i) = inival * wk * (1# - wk)
            Next i
            ' weight increments for the hidden-to-output layer
            For i = 1 To OutUnitNums
                DCw_Out(i) = alpha * DEL_Out(i)
                For j = 1 To HideUnitNums
                    Dw_HideOut(i, j) = alpha * DEL_Out(i) * OutofHideLayer(j)
                Next j
            Next i
            ' weight increments for the input-to-hidden layer
            For i = 1 To HideUnitNums
                DCw_Hide(i) = beta * DEL_Hide(i)
                For j = 1 To InputUnitNums
                    Dw_InputHide(i, j) = beta * DEL_Hide(i) * OutofInputLayer(j)
                Next j
            Next i
            ' apply updates with momentum: input layer to hidden layer
            For i = 1 To HideUnitNums
                wk = moment * OCw_Hide(i) + DCw_Hide(i)
                Cw_Hide(i) = Cw_Hide(i) + wk
                OCw_Hide(i) = wk
                For j = 1 To InputUnitNums
                    wk = moment * Ow_InputHide(i, j) + Dw_InputHide(i, j)
                    w_InputHide(i, j) = w_InputHide(i, j) + wk
                    Ow_InputHide(i, j) = wk
                Next j
            Next i
            ' apply updates with momentum: hidden layer to output layer
            For i = 1 To OutUnitNums
                wk = moment * OCw_Out(i) + DCw_Out(i)
                Cw_Out(i) = Cw_Out(i) + wk
                OCw_Out(i) = wk
                For j = 1 To HideUnitNums
                    wk = moment * Ow_HideOut(i, j) + Dw_HideOut(i, j)
                    w_HideOut(i, j) = w_HideOut(i, j) + wk
                    Ow_HideOut(i, j) = wk
                Next j
            Next i
            ' total error over all training patterns
            tempNum = tempNum + Error
        Next loopl
        ' leave the loop once the required error limit is reached
        If (tempNum / 2) <= ErLimit Then
            Exit For
        End If
    Next loopt

    List1.Clear
    For i = 1 To OutUnitNums
        For j = 1 To HideUnitNums
            List1.AddItem "w_HideOut(" & i & "," & j & ")=" & w_HideOut(i, j)
        Next j
        List1.AddItem "OutofOutLayer(" & i & ")=" & OutofOutLayer(i)
    Next i
    For i = 1 To HideUnitNums
        For j = 1 To InputUnitNums
            List1.AddItem "w_InputHide(" & i & "," & j & ")=" & w_InputHide(i, j)
        Next j
        List1.AddItem "OutofHideLayer(" & i & ")=" & OutofHideLayer(i)
    Next i
    List1.AddItem "Global error = " & Format$(tempNum / 2, "#.#,#")
    List1.AddItem "Prediction error = " & Format$(Sqr(tempNum / 2), "#.#")
    List1.AddItem "Loop count = " & FactTimes
    cmdSave.Enabled = True
    cmdNetCal.Enabled = False
    Beep

    ' save the trained weights and offsets through the flex grids
    vsFlexArray3.Rows = HideUnitNums + 1
    vsFlexArray3.Cols = InputUnitNums + 1
    vsFlexArray5.Rows = HideUnitNums + 1
    For i = 1 To HideUnitNums
        vsFlexArray5.TextMatrix(i, 1) = Cw_Hide(i)
        For j = 1 To InputUnitNums
            vsFlexArray3.TextMatrix(i, j) = w_InputHide(i, j)
        Next j
    Next i
    vsFlexArray3.SaveGrid App.Path & "\WeightlN_HD.dat", flexFileAll
    vsFlexArray5.SaveGrid App.Path & "\OffsetHIDE.dat", flexFileAll
    vsFlexArray4.Rows = OutUnitNums + 1
    vsFlexArray4.Cols = HideUnitNums + 1
    vsFlexArray6.Rows = OutUnitNums + 1
    For i = 1 To OutUnitNums
        vsFlexArray6.TextMatrix(i, 1) = Cw_Out(i)
        For j = 1 To HideUnitNums
            vsFlexArray4.TextMatrix(i, j) = w_HideOut(i, j)
            ' ... (the source listing is truncated at this point)
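For readers porting the routine, the same training step can be sketched in Python. This is a minimal sketch, not the original program: the function name `train_bp`, the weight initialisation, and the `moment` default are assumptions, since the VB listing leaves `moment` and the initial weights outside this subroutine. The roles of `alpha` (output-layer rate), `beta` (hidden-layer rate), and the `(t - y) * y * (1 - y)` sigmoid deltas follow the VB code.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_bp(patterns, n_in, n_hide, n_out,
             alpha=0.3, beta=0.3, moment=0.5,
             epochs=1000, err_limit=1e-3, seed=0):
    """Online BP with momentum, mirroring the VB routine.
    patterns: list of (inputs, targets); returns final SSE / 2."""
    rnd = random.Random(seed)
    # weights[i][j]: from unit j of the previous layer to unit i
    w_ih = [[rnd.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hide)]
    w_ho = [[rnd.uniform(-0.5, 0.5) for _ in range(n_hide)] for _ in range(n_out)]
    b_h, b_o = [0.0] * n_hide, [0.0] * n_out          # Cw_Hide / Cw_Out
    # previous updates, for the momentum term (Ow_* / OCw_* in the VB code)
    ow_ih = [[0.0] * n_in for _ in range(n_hide)]
    ow_ho = [[0.0] * n_hide for _ in range(n_out)]
    ob_h, ob_o = [0.0] * n_hide, [0.0] * n_out
    total = 0.0
    for _ in range(epochs):
        total = 0.0
        for x, t in patterns:
            # forward pass: input -> hidden -> output
            h = [sigmoid(sum(w_ih[i][j] * x[j] for j in range(n_in)) + b_h[i])
                 for i in range(n_hide)]
            y = [sigmoid(sum(w_ho[i][j] * h[j] for j in range(n_hide)) + b_o[i])
                 for i in range(n_out)]
            # output deltas: (t - y) * y * (1 - y), as in DEL_Out
            d_out = [(t[i] - y[i]) * y[i] * (1.0 - y[i]) for i in range(n_out)]
            total += sum((t[i] - y[i]) ** 2 for i in range(n_out))
            # hidden deltas, as in DEL_Hide
            d_hide = [sum(d_out[k] * w_ho[k][i] for k in range(n_out))
                      * h[i] * (1.0 - h[i]) for i in range(n_hide)]
            # momentum updates: hidden -> output
            for i in range(n_out):
                for j in range(n_hide):
                    wk = moment * ow_ho[i][j] + alpha * d_out[i] * h[j]
                    w_ho[i][j] += wk
                    ow_ho[i][j] = wk
                wk = moment * ob_o[i] + alpha * d_out[i]
                b_o[i] += wk
                ob_o[i] = wk
            # momentum updates: input -> hidden
            for i in range(n_hide):
                for j in range(n_in):
                    wk = moment * ow_ih[i][j] + beta * d_hide[i] * x[j]
                    w_ih[i][j] += wk
                    ow_ih[i][j] = wk
                wk = moment * ob_h[i] + beta * d_hide[i]
                b_h[i] += wk
                ob_h[i] = wk
        if total / 2 <= err_limit:           # same stop test as the VB code
            break
    return total / 2
```

Combining the previous update `ow_*` with the fresh gradient step in a single `wk` is equivalent to the VB code's two-stage `Dw_* / Ow_*` bookkeeping, since the deltas are computed before any weights change.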
