Computational Cognitive Neuroscience Lab


Computational Cognitive Neuroscience Lab
Today: Model Learning

Today:
- Homework is due Friday, Feb 17.
- The Chapter 4 homework is shorter than the last one!
- Undergrads omit …

Hebbian Learning
- "Neurons that fire together, wire together": correlation between sending and receiving activity strengthens the connection between them.
- "Don't fire together, unwire": anti-correlation between sending and receiving activity weakens the connection.

LTP/D via NMDA receptors
- NMDA receptors allow calcium to enter the (postsynaptic) cell.
- NMDA receptors are blocked by Mg2+ ions, which are cast off when the membrane potential increases.
- Glutamate (excitatory) binds to the unblocked NMDA receptor, causing a structural change that allows Ca2+ to pass through.

Calcium and Synapses
- Calcium initiates multiple chemical pathways, depending on the level of calcium:
- Low Ca2+ leads to long-term depression (LTD); high Ca2+ leads to long-term potentiation (LTP).
- LTP/D effects: new postsynaptic receptors, increased dendritic spine size, or increased presynaptic release processes (via a retrograde messenger).

Fixing Hebbian learning
- Plain Hebbian learning results in infinite weights! Fix: Oja's normalization (savg_corr).
- When to learn? Conditional PCA: learn only when you see something interesting.
- A single unit hogs everything? kWTA and contrast enhancement: specialization.

Principal Components Analysis (PCA)
- "Principal," as in primary, not "principle," as in some idea.
- PCA seeks a linear combination of variables such that maximum variance is extracted from the variables. It then removes this variance and seeks a second linear combination that explains the maximum proportion of the remaining variance, and so on until you run out of variance.

PCA continued
- This is like linear regression, except that the whole collection of variables (a vector) is correlated with itself to make a matrix; in effect, the collection of variables is regressed on itself.
- The line of best fit through this regression is the first principal component!

PCA cartoon
- (figure slide)

Conditional PCA
- "Perform PCA only when a particular input is received."
- The condition: the forces that determine when a receiving unit is active.
- Competition means hidden units will specialize for particular inputs, so hidden units only learn when their favorite input is available.

Self-organizing learning
- kWTA determines which hidden units are active for a given input.
- CPCA ensures those hidden units learn only about a single aspect of that input.
- Contrast enhancement drives high weights higher and low weights lower, helping units specialize (and share).

Bias-variance dilemma
- High bias: actual experience does not change the model much, so the biases had better be good!
- Low bias: experience highly determines learning, but so does random error! The model could have come out differently: high model variance.

Architecture as Bias
- Inhibition drives competition, competition determines which units are active, and unit activity determines learning.
- Thus, deciding which units share inhibitory connections (i.e., which units are in the same layer) affects what is learned. The architecture is the learning bias!

Fidelity and Simplicity of representations
- Information must be lost in the world-to-brain transformation (p. 118).
- There is a tradeoff between the amount of information lost and the complexity of the representation.
- The fidelity/simplicity tradeoff is set by: Conditional PCA (first principal component only), competition (the k value), and contrast enhancement (savg_corr, wt_gain).
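The runaway-weight problem and Oja's fix can be seen in a few lines of NumPy. This is an illustrative sketch, not the simulator's savg_corr implementation; the data, learning rate, and variable names are invented for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: 2-D samples whose variance is largest along the [1, 1] diagonal.
z = rng.normal(size=(1000, 2)) * np.array([2.0, 0.5])
rot = np.array([[1.0, 1.0], [-1.0, 1.0]]) / np.sqrt(2.0)
x = z @ rot

eps = 0.01
w_hebb = np.array([1.0, 0.0])
w_oja = np.array([1.0, 0.0])
for xi in x:
    y = w_hebb @ xi
    w_hebb = w_hebb + eps * y * xi              # plain Hebbian: weights grow without bound
    y = w_oja @ xi
    w_oja = w_oja + eps * y * (xi - y * w_oja)  # Oja's rule: the subtractive term self-normalizes

print(np.linalg.norm(w_hebb))  # astronomically large
print(np.linalg.norm(w_oja))   # stays near 1
```

The plain Hebbian weight vector grows multiplicatively on every correlated input; Oja's extra term subtracts the weight back in proportion to the squared output, so the norm settles near 1 while the direction tracks the dominant correlation.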
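The "correlate the whole collection of variables with itself to make a matrix" recipe can be sketched directly: build the covariance matrix and take its top eigenvector, which is the first principal component. The data here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=500)
x2 = 0.8 * x1 + 0.3 * rng.normal(size=500)   # x2 tracks x1, plus noise
data = np.column_stack([x1, x2])

# Correlate the collection of variables with itself: the covariance matrix.
cov = np.cov(data, rowvar=False)
evals, evecs = np.linalg.eigh(cov)           # eigh returns eigenvalues in ascending order
first_pc = evecs[:, -1]                      # direction of maximum variance
print(first_pc)
```

The resulting direction is close to the "line of best fit" through the cloud of points, which is exactly the regression analogy on the slide.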
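A minimal sketch of the conditional-PCA idea, assuming the standard CPCA update Δw = ε·y·(x − w) and a hard-coded "competition" in which the unit is active only for its favorite pattern. The patterns and learning rate are made up; the point is that the weights come to mirror the input the unit conditions on:

```python
import numpy as np

rng = np.random.default_rng(2)
# Two binary input patterns; the receiving unit wins only for pattern A.
pattern_a = np.array([1.0, 1.0, 0.0, 0.0])
pattern_b = np.array([0.0, 0.0, 1.0, 1.0])

eps = 0.05
w = np.full(4, 0.5)
for _ in range(500):
    if rng.random() < 0.5:
        x, y = pattern_a, 1.0   # unit wins the competition for its favorite input
    else:
        x, y = pattern_b, 0.0   # unit loses: no learning occurs
    w += eps * y * (x - w)      # CPCA: move weights toward x only when y is active

print(w.round(2))  # approaches [1, 1, 0, 0]
```

Because learning happens only when the unit is active, each weight converges toward the average input value given that the unit fired, which is why hidden units "only learn when their favorite input is available."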
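A crude k-winners-take-all can be sketched as below. Note that this simply zeroes all but the top k activations; the simulator's kWTA instead computes an inhibition level between the k-th and (k+1)-th most excited units, so this captures only the idea, not the actual algorithm, and the function name is invented:

```python
import numpy as np

def kwta(act, k):
    """Keep the k most active units, silence the rest (a simplification)."""
    out = np.zeros_like(act)
    winners = np.argsort(act)[-k:]   # indices of the k largest activations
    out[winners] = act[winners]
    return out

acts = np.array([0.2, 0.9, 0.1, 0.7, 0.4])
print(kwta(acts, 2))  # only the 0.9 and 0.7 units stay active
```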
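Contrast enhancement can be illustrated with a sigmoidal gain function of the kind the wt_gain parameter controls. The exact functional form below is an assumption for illustration, not necessarily the simulator's; what matters is the behavior: weights above 0.5 are driven toward 1, weights below 0.5 toward 0.

```python
import numpy as np

def contrast_enhance(w, gain=6.0):
    """Sigmoidal weight contrast enhancement (sketch; assumes 0 < w < 1)."""
    return 1.0 / (1.0 + ((1.0 - w) / w) ** gain)

w = np.array([0.3, 0.5, 0.7])
enhanced = contrast_enhance(w)
print(enhanced.round(3))  # low weight pushed near 0, 0.5 unchanged, high weight near 1
```

Sharpening the weight profile this way is what lets competing units commit to distinct inputs and specialize.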
