In 2018, I wrote an article describing the neural model and its relation to artificial neural networks. One chapter of the book that I refer to explains that certain properties can emerge when a set of neurons works together and forms a network. The book covers many theories, but what attracted me most is a network that can simulate how human memory works, called the Hopfield Network [Hopfield, J.J. 1982].
The Hopfield Network is the predecessor of the Restricted Boltzmann Machine (RBM) and the Multilayer Perceptron (MLP). It is an energy-based, auto-associative, recurrent, and biologically inspired memory network.
- It is an energy-based network since it uses an energy function and minimizes that energy to train the weights. Imagine a neuron triggered by an external impulse of energy; the energy propagates throughout the network and excites all the other connected neurons.
- It is an auto-associative memory network, meaning the trained network can recover the full memorized information when given only partial information as input. For example, suppose we have a network of 10 neurons connected to each other. When we trigger 8 of them into a so-called input state A, the network reacts and computes back and forth until all the neurons and synapses (the connections between neurons) are stable. Call this stable state S; the network has now implicitly memorized input state A. Later, when we trigger only some of those neurons (a subset of the original 8, fewer than 8), the network reacts again until all neurons are stable, and in that stable condition the original 8 neurons are back in input state A.
- It is a recurrent network, meaning the network output feeds back into the network input, so the network forms a directed graph; in this case, a directed cyclic graph.
- It is a biologically inspired network since the CA3 region of the hippocampus has a structure and behavior similar to the Hopfield Network.
It has been shown behaviorally that the CA3 supports spatial rapid one-trial learning, learning of arbitrary associations where space is a component, pattern completion, spatial short-term memory, and sequence learning by associations formed between successive items. [Cutsuridis, V. & Wennekers, T. 2006]
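The pattern-completion behavior described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation: the 10-neuron network, the stored pattern, and the corrupted input are all made up for this example, and the weights are trained with the Hebbian rule discussed below.

```python
import numpy as np

# A tiny Hopfield network: 10 bipolar neurons (+1 activated, -1 non-activated).
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1, 1, 1])

# Hebbian weights for a single stored pattern: outer product, no self-connections.
W = np.outer(pattern, pattern)
np.fill_diagonal(W, 0)

def energy(state, W):
    """Hopfield energy E = -1/2 * s^T W s; stable states sit at local minima."""
    return -0.5 * state @ W @ state

def recall(state, W, max_sweeps=10):
    """Asynchronously update neurons until no neuron changes (a stable state)."""
    state = state.copy()
    for _ in range(max_sweeps):
        prev = state.copy()
        for i in range(len(state)):
            # Each neuron aligns with the weighted input from the others.
            state[i] = 1 if W[i] @ state >= 0 else -1
        if np.array_equal(state, prev):  # stable: the network has settled
            break
    return state

# Give the network only partial information: flip two of the ten neurons.
noisy = pattern.copy()
noisy[0] = -noisy[0]
noisy[4] = -noisy[4]

recovered = recall(noisy, W)
print(np.array_equal(recovered, pattern))        # True: full pattern recovered
print(energy(recovered, W) < energy(noisy, W))   # True: recall lowered the energy
```

Running the recall from the corrupted state settles back into the stored pattern, and the energy of the recovered state is lower than that of the noisy input, which is exactly the "energy minimization" and "auto-association" described above.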
Different from other neural networks that use backpropagation to learn the weights, the Hopfield Network uses Hebb's rule, or Hebbian learning [Hebb, D.O. 1949], which is also a biologically inspired learning rule. The idea behind Hebbian learning is that the connection between two neurons (a synapse) gets stronger if the neurons at both ends are actively correlated. In a Hopfield Network, neurons have only two states, activated and non-activated. Sometimes people quantify the activated state as 1 and the non-activated state as 0; sometimes they use 1 for activated and -1 for non-activated.
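The Hebbian rule itself can be written down directly; for a set of bipolar (+1/-1) patterns, each weight is just the sum of the products of the two endpoint states. The patterns below are arbitrary examples chosen for this sketch.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian learning: w_ij = sum over stored patterns of s_i * s_j.

    A synapse strengthens (+1 per pattern) when the two neurons it connects
    are in the same state, and weakens (-1 per pattern) when they disagree.
    """
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)    # correlated pairs add +1, anti-correlated add -1
    np.fill_diagonal(W, 0)     # neurons have no connection to themselves
    return W

# Two example bipolar patterns over 6 neurons.
patterns = np.array([
    [1, -1,  1, -1, 1, -1],
    [1,  1, -1, -1, 1,  1],
])
W = hebbian_weights(patterns)
print(W[0, 4])  # neurons 0 and 4 agree in both patterns -> weight 2.0
print(W[0, 1])  # agree in one pattern, disagree in the other -> weight 0.0
```

Note that the resulting weight matrix is symmetric (w_ij = w_ji), which is what guarantees the energy of the network never increases during recall.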