Automatic Differentiation via Contour Integration


Motivation:

There has been some back-and-forth among scientists about whether biological networks such as brains might compute derivatives. I have previously made my position on this issue clear: https://twitter.com/bayesianbrain/status/1202650626653597698

The standard counter-argument is that backpropagation isn't biologically plausible. Yet partial derivatives are very useful for closed-loop control, so we are faced with a fundamental question we can't ignore: how might large branching structures in the brain and other biological systems compute derivatives?

After some reflection I realised that an important result in complex analysis due to Cauchy, the Cauchy Integral Formula, can be used to compute derivatives with a simple forward propagation of signals combined with a Monte Carlo method: the derivative at a point is recovered by evaluating the function at points on a circle around it and averaging. Incidentally, Cauchy also discovered the gradient descent algorithm.
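Concretely, for a function f that is holomorphic in a neighbourhood of x containing the unit circle around x, the formula for the first derivative reads (a minimal sketch, assuming a unit-radius contour, which is the choice the implementation below makes):

f'(x) = \frac{1}{2\pi i} \oint_{|z-x|=1} \frac{f(z)}{(z-x)^2}\, dz
      = \frac{1}{2\pi} \int_0^{2\pi} f(x + e^{i\theta})\, e^{-i\theta}\, d\theta
      \approx \frac{2}{N} \sum_{k=1}^{N/2} f(x + e^{i\theta_k})\, e^{-i\theta_k}

where the angles \theta_k are drawn uniformly (with replacement) from a grid of N equally spaced points on [0, 2\pi). Every term in the sum requires only a forward evaluation of f.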

Minimal implementation in the Julia language:

function mc_nabla(f, x::Float64, delta::Float64)

  ## Monte Carlo estimate of the derivative f'(x) via the Cauchy Integral Formula,
  ## for functions holomorphic in a neighbourhood of x; here applied to
  ## real-valued functions of a single variable

  ## number of grid points on the unit circle, spaced delta radians apart:
  N = round(Int,2*pi/delta)

  ## sample only half of the N grid angles (with replacement):
  sample = rand(1:N,round(Int,N/2))
  thetas = sample*delta

  ## rotations e^(-i*theta) and contour points x + e^(i*theta):
  rotations = map(theta -> exp(-im*theta),thetas)
  arguments = x .+ conj.(rotations)

  ## Monte Carlo average of f(x + e^(i*theta))*e^(-i*theta):
  expectation = (2.0/N)*real(sum(map(f,arguments).*rotations))

  return expectation

end
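
To sanity-check the estimator (my own example, not from the original post), one can differentiate a holomorphic function whose derivative is known in closed form; the grid spacing delta = 0.01 is an arbitrary choice:

## hypothetical usage: sin is entire, so mc_nabla(sin, 1.0, 0.01)
## should return a value close to the analytic derivative cos(1.0)
estimate = mc_nabla(sin, 1.0, 0.01)
println("Monte Carlo estimate: ", estimate)
println("Analytic derivative:  ", cos(1.0))

Up to Monte Carlo noise from the sampled angles, the estimate lands near cos(1.0) ≈ 0.5403.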

Blog post:

https://keplerlounge.com/neural-computation/2020/01/16/complex-auto-diff.html

Jupyter Notebook:

https://github.com/AidanRocke/AutoDiff/blob/master/cauchy_tutorial.ipynb

