In 2018, I wrote an article describing the neural model and its relation to artificial neural networks. One chapter of the book that I refer to explains that certain properties can emerge when a set of neurons works together and forms a network. The book covers many theories, but what attracts me most is a network that can simulate how human memory works, called the Hopfield Network [Hopfield, J.J. 1982].
The Hopfield Network is a predecessor of the Restricted Boltzmann Machine (RBM) and the Multilayer Perceptron (MLP). It is an energy-based, auto-associative, recurrent, and biologically inspired memory network.
- It is an energy-based network: it defines an energy function and minimizes that energy to train the weights. Imagine a neuron being triggered by an external impulse of energy; the energy propagates throughout the network and excites all the other connected neurons.
- It is an auto-associative memory network: the trained network can recover the full memorized information when given only partial information as input. For example, say we have a network of 10 neurons connected to each other. When we trigger 8 of them into a so-called input state A, the network reacts and computes back and forth until all the neurons and synapses (the connections between neurons) are stable. Call this stable state S; at this point the network has implicitly memorized input state A. Later, when we trigger only some of those same neurons (fewer than the original 8), the network reacts again until all neurons are stable, and in that stable condition the original 8 neurons are back in input state A.
- It is a recurrent network: the network output feeds back into the network input, so the network forms a directed graph, in this case a directed cyclic graph.
- It is a biologically inspired network: the CA3 region of the hippocampus has a structure and behavior similar to the Hopfield Network.
It has been shown behaviorally that the CA3 supports spatial rapid one-trial learning, learning of arbitrary associations where space is a component, pattern completion, spatial short-term memory, and sequence learning by associations formed between successive items. [Cutsuridis, V. & Wennekers, T. 2006]
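The energy-minimizing dynamics and the pattern-completion behavior described above can be sketched in a few lines of NumPy. This is a minimal illustration, assuming neurons take states ±1 and using a standard asynchronous update; the function names and the 10-neuron example pattern are my own, not from the original paper:

```python
import numpy as np

def energy(W, s):
    # Hopfield energy: E = -1/2 * sum_ij w_ij * s_i * s_j.
    # Each neuron update never increases this quantity.
    return -0.5 * s @ W @ s

def recall(W, s, max_sweeps=100):
    # Asynchronous recall: set each neuron to the state that
    # lowers the energy, and repeat until nothing changes.
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(s)):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i], changed = new, True
        if not changed:
            break
    return s

# A 10-neuron network that has memorized one pattern (state A);
# here the weights come directly from the pattern's outer product.
A = np.array([1, -1, 1, 1, -1, 1, -1, -1, 1, 1])
W = np.outer(A, A).astype(float)
np.fill_diagonal(W, 0)  # no self-connections

noisy = A.copy()
noisy[[0, 3]] *= -1           # only 8 of the 10 neurons start correct
recovered = recall(W, noisy)  # the network settles back into state A
```

Starting from the partial state (8 of 10 neurons correct), every update moves the network downhill in energy until it lands in the stored stable state, which is exactly the pattern-completion behavior described above.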
Unlike other neural networks that use backpropagation to learn the weights, the Hopfield Network uses Hebb's rule, or Hebbian Learning [Hebb, D.O. 1949], which is also a biologically inspired learning rule. The idea behind Hebbian Learning is that the connection between two neurons (the synapse) grows stronger when the neurons at both ends are actively correlated. In a Hopfield Network, neurons have only two states, activated and non-activated. Some people quantify the activated state as 1 and the non-activated state as 0; others use 1 for activated and -1 for non-activated.
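With the ±1 convention, Hebb's rule amounts to summing the outer product of each stored pattern with itself: the weight between two neurons grows when they are active together. This is a sketch of the classic storage rule; the function name and the example patterns are illustrative:

```python
import numpy as np

def hebbian_weights(patterns):
    # Hebb's rule for +/-1 states: w_ij = sum over patterns of x_i * x_j.
    # Neurons that activate together get a stronger (positive) synapse;
    # anti-correlated neurons get a negative one.
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for x in patterns:
        W += np.outer(x, x)
    np.fill_diagonal(W, 0)  # no self-connections; W stays symmetric
    return W

# Store two (nearly orthogonal) 10-neuron patterns in one weight matrix.
p1 = np.array([1, 1, 1, 1, 1, -1, -1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1, 1, -1])
W = hebbian_weights(np.stack([p1, p2]))
```

Each stored pattern is then a fixed point of the network dynamics: taking the sign of `W @ p1` gives back `p1` itself, which is what makes the later recall from partial input possible.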