Visualizing Episodic Memory with Hopfield Network



In 2018, I wrote an article describing the neural model and its relation to artificial neural networks. One chapter of the book that I refer to explains that certain properties can emerge when a set of neurons work together and form a network. The book covers many theories, but what attracted me most is a network that can simulate how human memory works, called the Hopfield Network [Hopfield, J.J. 1982].

The Hopfield Network is the predecessor of the Restricted Boltzmann Machine (RBM) and the Multilayer Perceptron (MLP). It is an energy-based, auto-associative, recurrent, and biologically inspired memory network.

  • It is an energy-based network because it uses an energy function and minimizes that energy to train the weights. Imagine a neuron being triggered by an external impulse of energy; the energy propagates throughout the network and excites all the other connected neurons.
  • It is an auto-associative memory network, meaning the trained network can recover the full memorized information when given only partial information as input. For example, say we have a network of 10 neurons connected to each other. When we trigger 8 of the neurons with a so-called input A state, the network reacts and recalculates back and forth until all the neurons and synapses (the connections between neurons) are stable. Call this stable state S; at this point the network has implicitly memorized the input A state. Later, when we trigger only some of those neurons again (a subset of the 8 that were previously triggered), the network reacts again until all neurons are stable, and that stable condition is the one where the original 8 neurons are back in the input A state.
  • It is a recurrent network, meaning the network output feeds back into the network input, so the network forms a directed graph: in this case, a directed cyclic graph.
  • It is a biologically inspired network, since the CA3 region of the hippocampus has a structure and behavior similar to the Hopfield Network.
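To make the energy-based view concrete, here is a minimal sketch in NumPy (the 3-neuron weight matrix is hand-picked for illustration, not taken from the article). It shows the Hopfield energy function and one asynchronous update step; with symmetric weights and a zero diagonal, each update can only keep or lower the energy:

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(W, s):
    # Hopfield energy: E = -1/2 * sum_ij w_ij * s_i * s_j
    return -0.5 * s @ W @ s

def async_step(W, s):
    # Update one randomly chosen neuron to the sign of its input field.
    # With symmetric W and zero diagonal, this never increases the energy.
    i = rng.integers(len(s))
    s = s.copy()
    s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Toy symmetric weight matrix with zero diagonal (illustrative values).
W = np.array([[ 0.,  1., -1.],
              [ 1.,  0.,  1.],
              [-1.,  1.,  0.]])
s = np.array([1., -1., 1.])
for _ in range(10):
    s_next = async_step(W, s)
    assert energy(W, s_next) <= energy(W, s) + 1e-9
    s = s_next
```

Repeating this update drives the state downhill in energy until it settles in a stable state, which is exactly the "react and recalculate until stable" behavior described above.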

It has been shown behaviorally that the CA3 supports spatial rapid one-trial learning, learning of arbitrary associations where space is a component, pattern completion, spatial short-term memory, and sequence learning by associations formed between successive items. [Cutsuridis, V. & Wennekers, T. 2006]

Unlike other neural networks that use backpropagation to learn the weights, the Hopfield Network uses Hebb's learning, or Hebbian Learning [Hebb, D.O. 1949], which is also a biologically inspired learning rule. The idea behind Hebbian Learning is that the connection between two neurons (the synapse) grows stronger when the neurons at both ends are actively correlated. In a Hopfield Network, neurons have only two states, activated and non-activated. Sometimes people quantify the activated state as 1 and the non-activated state as 0; sometimes the activated state is 1 and the non-activated state is -1.
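The following sketch ties Hebbian learning and auto-associative recall together, using the ±1 convention (the 10-neuron pattern and the helper names are made up for illustration): the weights are the outer product of the stored pattern with itself, and a corrupted cue is then completed back to the stored memory.

```python
import numpy as np

def train_hebbian(patterns):
    # Hebb's rule: w_ij accumulates x_i * x_j over the stored patterns
    # (states in {-1, +1}); the diagonal is zeroed so a neuron does not
    # reinforce itself.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, s, steps=20):
    # Synchronous recall: set every neuron to the sign of its input field,
    # repeating until the state stops changing.
    for _ in range(steps):
        s_new = np.where(W @ s >= 0, 1.0, -1.0)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Store one 10-neuron pattern, then cue the network with a corrupted copy
# (two flipped bits) and let it complete the memory.
stored = np.array([1, 1, 1, 1, -1, -1, -1, -1, 1, -1], dtype=float)
W = train_hebbian(stored[None, :])
cue = stored.copy()
cue[0], cue[4] = -cue[0], -cue[4]
restored = recall(W, cue)
```

With a single stored pattern the corrupted cue is restored in one update, which is the pattern-completion behavior described in the auto-associative bullet above.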

