Backpropagation from scratch on Mini-Batches



Implementation of Backpropagation algo on mini-batches with step by step execution of equations.

Apr 19 · 4 min read


Photo by john foust on Unsplash

You must be thinking: another backprop-from-scratch blog? Well, kinda, yes. But I thought this through and came up with something you can actually tinker with, along with easy-to-understand equations of the kind you usually write down to understand the algorithm.

This blog focuses on implementing the backpropagation algorithm step by step on mini-batches of the dataset. There are plenty of tutorials and blogs that demonstrate backpropagation in detail, including all the calculus and algebra behind it, so I'll skip that part and cut straight to the equations and their implementation in Python (because why not).

Why from scratch?

It's a long-standing community question: why implement an algorithm from scratch when almost every framework offers it ready to use? With high-level frameworks you may not even notice backpropagation doing its magic. To understand it completely, inside and out, you should get your hands dirty with it at least once. Backpropagation is also something you can experiment with while playing around.

Why Mini-Batches?

The reason behind mini-batches is simple: they save memory and processing time. We divide the data into mini-batches and feed the algorithm a fraction of the dataset on each iteration of the training loop. Feeding a 10000x10000 matrix at once would not only blow up memory but also take a long time to run. Bringing it down to 50 rows per iteration reduces memory usage, and you can track progress along the way.
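The splitting described above can be sketched in a few lines of NumPy. This is a minimal sketch, not code from the article: the function name `iterate_minibatches` and its parameters are my own.

```python
import numpy as np

def iterate_minibatches(X, y, batch_size=50, shuffle=True):
    """Yield successive (X_batch, y_batch) slices of the dataset."""
    idx = np.arange(X.shape[0])
    if shuffle:
        np.random.shuffle(idx)  # visit rows in a fresh order each epoch
    for start in range(0, X.shape[0], batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

# 10,000 rows in batches of 50 gives 200 iterations per epoch
X = np.random.rand(10000, 5)
y = np.random.randint(0, 2, size=(10000, 1))
n_batches = sum(1 for _ in iterate_minibatches(X, y, batch_size=50))
```

Each training epoch then becomes a loop over this generator rather than one pass over the full matrix.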

Note: this is different from the stochastic method, where we take a stratified sample from the data for each class and train on that, assuming the model will generalize.

Implementation time!

This is the head of the data I’ll be using for this implementation.


The target variable here is Occupancy which is a categorical variable (0/1).

This will be the architecture we’ll be coding.


Algorithm:

for i := 1 to m:

1. Perform forward propagation (a forward pass) to calculate the activation values of the neurons in each layer. With weights W^(l), biases b^(l), and activation function σ, each layer computes

   a^(l) = σ(z^(l)), where z^(l) = W^(l) a^(l-1) + b^(l)

2. Backpropagation step:

   - Calculate the error term (MSE or LogLoss, your pick) at the output layer using the label in the data. For log-loss with a sigmoid output this reduces to

     δ^(L) = a^(L) - y

   - Error terms in the hidden layers are calculated using

     δ^(l) = ((W^(l+1))^T δ^(l+1)) ⊙ σ'(z^(l))

3. Update the weights and biases with the error terms, averaged over the mini-batch of size m:

     W^(l) := W^(l) - (η/m) δ^(l) (a^(l-1))^T,  b^(l) := b^(l) - (η/m) δ^(l)
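The whole loop above can be sketched end to end in NumPy. This is a minimal sketch under assumptions, not the article's exact code: I assume a single hidden layer with sigmoid activations and a sigmoid output trained with log-loss on a binary target such as Occupancy, and the names `train_minibatch`, `predict`, and all hyperparameters are my own.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_minibatch(X, y, hidden=4, lr=0.5, epochs=200, batch_size=50, seed=0):
    """Mini-batch gradient descent with a hand-written backward pass."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1));          b2 = np.zeros(1)
    for _ in range(epochs):
        idx = rng.permutation(X.shape[0])
        for start in range(0, X.shape[0], batch_size):
            b = idx[start:start + batch_size]
            Xb, yb = X[b], y[b]
            # 1. Forward pass: activations layer by layer
            a1 = sigmoid(Xb @ W1 + b1)
            a2 = sigmoid(a1 @ W2 + b2)
            # 2. Backprop: output error term for log-loss + sigmoid is a2 - y
            d2 = a2 - yb
            # Hidden error term: propagate back through W2, scale by sigmoid'
            d1 = (d2 @ W2.T) * a1 * (1 - a1)
            # 3. Gradient step, averaged over the mini-batch
            m = Xb.shape[0]
            W2 -= lr * a1.T @ d2 / m; b2 -= lr * d2.mean(axis=0)
            W1 -= lr * Xb.T @ d1 / m; b1 -= lr * d1.mean(axis=0)
    return W1, b1, W2, b2

def predict(X, W1, b1, W2, b2):
    return sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
```

Each mini-batch triggers one forward pass, one backward pass, and one weight update, so progress is visible (and memory stays bounded) long before the full dataset has been seen.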
