Visualizing Episodic Memory with Hopfield Network

In 2018, I wrote an article describing the neural model and its relation to artificial neural networks. One chapter of the book that I refer to explains that certain properties can emerge when a set of neurons works together and forms a network. The book covers many theories, but what attracted me most is a network that can simulate how human memory works, called the Hopfield Network [Hopfield, J.J. 1982].

The Hopfield Network is the predecessor of the Restricted Boltzmann Machine (RBM) and the Multilayer Perceptron (MLP). It is an energy-based, auto-associative, recurrent, and biologically-inspired memory network.

  • It is an energy-based network since it defines an energy function over its states and evolves by minimizing that energy. Imagine a neuron being triggered by an external impulse: the energy propagates throughout the network and excites all the other connected neurons.
  • It is an auto-associative memory network, meaning the trained network can recover complete memorized information when given only partial information as input. For example, suppose we have a network of 10 neurons connected to each other. When we trigger 8 of them with a so-called input A state, the network reacts and updates back and forth until all the neurons and synapses (the connections between neurons) are stable. Call this stable state S; at this point the network has implicitly memorized the input A state. Later, when we trigger only some of those neurons (a subset of the 8 triggered before), the network reacts again until all neurons are stable, and in that stable condition the original 8 neurons are back in the input A state (see the sketch after this list).
  • It is a recurrent network, meaning the network output feeds back into the network input, so the network forms a directed graph; in this case, a directed cyclic graph.
  • It is a biologically-inspired network since the CA3 region of the hippocampus has a structure and behavior similar to the Hopfield Network.
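
To make the energy-based and auto-associative properties concrete, here is a minimal sketch in Python with NumPy. It assumes ±1 neuron states and a symmetric weight matrix W with a zero diagonal (how W is learned comes later); the function names energy() and recall() are mine, chosen for illustration.

```python
import numpy as np

def energy(W, s):
    # Hopfield energy: E = -1/2 * s^T W s.
    # Stored memories sit at local minima of this function.
    return -0.5 * s @ W @ s

def recall(W, s, max_iters=100):
    # Start from a partial or corrupted pattern and update neurons
    # asynchronously (one at a time) until nothing changes.
    s = s.copy()
    for _ in range(max_iters):
        changed = False
        for i in np.random.permutation(len(s)):
            new_state = 1 if W[i] @ s >= 0 else -1
            if new_state != s[i]:
                s[i] = new_state
                changed = True
        if not changed:  # fixed point: a local energy minimum
            break
    return s
```

With a symmetric W and asynchronous updates, each flip can only lower (or keep) the energy, which is why the network settles into a stable state instead of oscillating.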

It has been shown behaviorally that the CA3 supports spatial rapid one-trial learning, learning of arbitrary associations where space is a component, pattern completion, spatial short-term memory, and sequence learning by associations formed between successive items. [Cutsuridis, V. & Wennekers, T. 2006]

Unlike other neural networks that use backpropagation to learn their weights, the Hopfield Network uses Hebb’s learning, or Hebbian Learning [Hebb, D.O. 1949], which is also biologically inspired. The idea behind Hebbian Learning is that the connection between two neurons (a synapse) grows stronger when the neurons at both ends are actively correlated. In a Hopfield Network, neurons have only two states, activated and non-activated. Sometimes people quantify the activated state as 1 and the non-activated state as 0; sometimes they quantify the activated state as 1 and the non-activated state as -1.
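
As a sketch of how the Hebbian rule is usually written for a Hopfield Network, here is the common outer-product storage rule under the ±1 convention; the pattern values below are made up for illustration.

```python
import numpy as np

def train_hebbian(patterns):
    # Hebbian outer-product rule: w_ij grows when neurons i and j are
    # correlated (same sign) across the stored patterns, and shrinks
    # when they are anti-correlated.
    n = len(patterns[0])
    W = np.zeros((n, n))
    for p in patterns:
        p = np.asarray(p, dtype=float)
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)  # no neuron connects to itself
    return W / len(patterns)

# Usage: store one 10-neuron pattern, corrupt part of it, then let the
# network complete it again.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1, 1, -1])
W = train_hebbian([pattern])
noisy = pattern.copy()
noisy[:3] *= -1  # flip the first three neurons
# recovered = recall(W, noisy)  # recall() from the sketch above; should equal `pattern`
```

Note that this rule needs only local information (the activity at the two ends of each synapse), which is what makes it biologically plausible compared with backpropagation.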

