Batch vs Stochastic Gradient Descent



Learn the difference between Batch and Stochastic Gradient Descent, and choose the best variant for your model.

May 31 · 4 min read


Photo by Bailey Zindel on Unsplash

Before diving into Gradient Descent, we'll look at how a Linear Regression model deals with the cost function. The goal of reaching the global minimum is to minimize the cost function, which is given by:

J(θ₀, θ₁) = (1 / 2m) · Σᵢ₌₁..ₘ (h(x⁽ⁱ⁾) − y⁽ⁱ⁾)²

Here, the hypothesis h(x) = θ₀ + θ₁·x represents a linear equation, where θ₀ is the bias (also known as the intercept) and θ₁ is the weight (slope) given to the feature x.
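The cost function above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the article; the toy data points are assumptions for the demo, and the cost uses the standard halved mean squared error so that its gradient comes out clean:

```python
import numpy as np

def hypothesis(theta0, theta1, x):
    # h(x) = theta0 + theta1 * x -- a straight line
    return theta0 + theta1 * x

def cost(theta0, theta1, x, y):
    # Halved mean squared error over the m training examples
    m = len(x)
    errors = hypothesis(theta0, theta1, x) - y
    return np.sum(errors ** 2) / (2 * m)

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])   # lies exactly on the line y = 2x
print(cost(0.0, 2.0, x, y))     # 0.0: a perfect fit has zero cost
print(cost(0.0, 0.0, x, y))     # positive: a bad fit has higher cost
```

Gradient descent searches for the (θ₀, θ₁) pair that drives this value as close to zero as the data allows.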

Fig. 1: Gradient descent taking steps down the cost curve toward the global minimum.

The weights and intercept are randomly initialized, and the model takes baby steps toward the minimum point. An important parameter in Gradient Descent is the size of those steps, determined by the learning-rate hyperparameter. Note that if we set the learning rate too high, the point takes large steps and will probably never reach the global minimum (leaving large errors). On the other hand, if the learning rate is too small, the purple point will take a very long time to reach the global minimum. Therefore, an optimal learning rate should be chosen.

