In 2018, I wrote an article describing the neural model and its relation to artificial neural networks. One chapter of the book that I refer to explains that certain properties can emerge when a set of neurons works together and forms a network. The book covers many theories, but what attracts me most is a network that can simulate how human memory works, called the Hopfield Network [Hopfield, J.J. 1982].
The Hopfield Network is the predecessor of the Restricted Boltzmann Machine (RBM) and the Multilayer Perceptron (MLP). It is an energy-based, auto-associative memory, recurrent, and biologically inspired network.
- It is an energy-based network since it uses an energy function and minimizes the energy to train the weights. Imagine a neuron triggered by an external impulse of energy; that energy propagates throughout the network and excites all the other connected neurons. (A minimal sketch of the energy function and the update rule follows this list.)
- It is an auto-associative memory network, meaning the trained network can recover the full memorized information when given only partial information as input. For example, say we have a network of 10 neurons connected to each other. When we trigger 8 neurons with a so-called input state A, the network reacts and calculates back and forth until all the neurons and synapses (the connections between neurons) are stable. Call this stable state S; at this point the network has implicitly memorized input state A. Later, when we trigger only some of those neurons (fewer than the original 8), the network reacts again until all neurons are stable, and that stable condition is one where the original 8 neurons are back in input state A.
- It is a recurrent network, meaning the network output feeds back into the network input, so the network forms a directed graph; in this case, a directed cyclic graph.
- It is a biologically inspired network since the CA3 region of the hippocampus has a structure and behavior similar to the Hopfield Network.
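Here is a minimal sketch of those first two points in Python, assuming bipolar states in {-1, +1}, a symmetric weight matrix with zero diagonal, and no bias terms. The function names `energy` and `recall` are my own for illustration, not from the original article.

```python
import numpy as np

def energy(W, s):
    # Hopfield energy E = -1/2 * sum_ij w_ij * s_i * s_j,
    # for states s_i in {-1, +1} and symmetric weights W with zero diagonal.
    return -0.5 * s @ W @ s

def recall(W, s, max_sweeps=100):
    # Asynchronous updates: each neuron flips toward the state that lowers
    # the energy, repeated until no neuron changes (a stable attractor state).
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in np.random.permutation(len(s)):
            new_si = 1 if W[i] @ s >= 0 else -1
            if new_si != s[i]:
                s[i] = new_si
                changed = True
        if not changed:
            break
    return s
```

Each update can only keep the energy the same or lower it, which is why the "back and forth" calculation described above eventually settles into a stable state.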
It has been shown behaviorally that the CA3 supports spatial rapid one-trial learning, learning of arbitrary associations where space is a component, pattern completion, spatial short-term memory, and sequence learning by associations formed between successive items. [Cutsuridis, V. & Wennekers, T. 2006]
Unlike other neural networks that use backpropagation to learn the weights, the Hopfield Network uses Hebb's rule, or Hebbian Learning [Hebb, D.O. 1949], which is also a biologically inspired learning rule. The idea behind Hebbian Learning is that the connection between two neurons (the synapse) becomes stronger when the neurons at both ends are actively correlated. In a Hopfield Network, neurons have only two states, activated and non-activated. Sometimes people quantify the activated state as 1 and the non-activated state as 0; sometimes they quantify the activated state as 1 and the non-activated state as -1.
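A small sketch of the Hebbian rule, again assuming bipolar {-1, +1} states; the function name `hebbian_weights` and the example pattern are my own for illustration. The commented-out line refers to the `recall` function from the earlier sketch.

```python
import numpy as np

def hebbian_weights(patterns):
    # Hebbian rule for a Hopfield network: w_ij = (1/n) * sum_p x_p[i] * x_p[j],
    # with the diagonal zeroed so neurons have no self-connections.
    # `patterns` has shape (num_patterns, num_neurons), entries in {-1, +1}.
    P = np.asarray(patterns, dtype=float)
    n = P.shape[1]
    W = (P.T @ P) / n
    np.fill_diagonal(W, 0.0)
    return W

# Usage: store one 10-neuron pattern, corrupt two bits, and recover it.
A = np.array([1, -1, 1, 1, -1, -1, 1, -1, 1, 1])
W = hebbian_weights([A])
probe = A.copy()
probe[0] *= -1   # flip two bits to simulate partial / noisy input
probe[3] *= -1
# recovered = recall(W, probe)   # converges back to A (see the earlier sketch)
```

Note how the rule matches the description above: whenever two neurons are in the same state across the stored patterns, the product x_p[i] * x_p[j] is positive and the synapse between them grows stronger.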