A confusion matrix is a matrix that illustrates the performance of a classification model on unseen data. It helps us identify how the model is performing on the test set. From this matrix, many other scores are calculated, such as Accuracy, Recall, Precision, and F1-score. It is important to know which type of score to use, as the right choice depends on the application.
There are two classes: Class 1 and Class 2.
Class 1: Positive
Class 2: Negative
Positive: The observation is true (e.g. the picture is a dog)
Negative: The observation is false (e.g. the picture is not a dog)
T.P. (True Positive): Truth and prediction are both Positive
T.N. (True Negative): Truth and prediction are both Negative
F.P. (False Positive): Truth is Negative but prediction is Positive
F.N. (False Negative): Truth is Positive but prediction is Negative
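These four counts can be tallied directly from a list of true labels and a list of predictions. Below is a minimal sketch in plain Python; the two label lists are made-up example data (1 = Positive, 0 = Negative), not output from any real model.

```python
# A minimal sketch: count T.P., T.N., F.P., F.N. from two label lists.
# The lists below are made-up example data (1 = Positive, 0 = Negative).
y_true = [1, 1, 1, 0, 0, 1, 0, 0]   # ground truth
y_pred = [1, 0, 1, 0, 1, 1, 0, 0]   # model predictions

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # both Positive
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # both Negative
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # predicted Positive, truth Negative
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # predicted Negative, truth Positive

print(tp, tn, fp, fn)  # 3 3 1 1 for the lists above
```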
Accuracy:
Accuracy is the ratio of the sum of True Positives (T.P.) and True Negatives (T.N.) to the sum of all the elements of the matrix.
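In symbols: Accuracy = (TP + TN) / (TP + TN + FP + FN)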
Precision:
Precision is defined as the ratio of True Positives (T.P.) to the sum of True Positives (T.P.) and False Positives (F.P.).
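In symbols: Precision = TP / (TP + FP)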
Recall:
Recall is defined as the ratio of True Positives (T.P.) to the sum of True Positives (T.P.) and False Negatives (F.N.).
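In symbols: Recall = TP / (TP + FN)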
High recall, low precision: Most of the positive examples are correctly recognized (low FN), but there are a lot of false positives (high FP).
Low recall, high precision: We miss a lot of positive examples (high FN), but those we predict as positive are indeed positive (low FP).
F1-score:
Since we have two measures (Precision and Recall), it helps to have a single measurement that represents both of them. We calculate the F-measure, which uses the harmonic mean instead of the arithmetic mean because it penalizes extreme values more.
The F-Measure will always be nearer to the smaller value of Precision or Recall.
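In symbols: F1-score = (2 * Precision * Recall) / (Precision + Recall)
Continuing the plain-Python sketch from earlier (tp, tn, fp and fn are the hypothetical counts computed there), all four scores follow directly from the counts:

```python
# Scores computed from the four counts of the earlier sketch.
accuracy  = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)
recall    = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of Precision and Recall

print(f"Accuracy={accuracy:.2f} Precision={precision:.2f} Recall={recall:.2f} F1={f1:.2f}")
```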
Exercise
Suppose a classifier produced the following counts on the test set: T.P. = 100, F.N. = 5, F.P. = 10, T.N. = 50.
Accuracy
Accuracy = (TP + TN) / (TP + TN + FP + FN) = (100 + 50) / (100 + 5 + 10 + 50) ≈ 0.91
Precision
Precision tells us how often the model is correct when it predicts yes.
Precision = TP / (TP + FP) = 100 / (100 + 10) ≈ 0.91
Recall
Recall gives us an idea of how often the model predicts yes when the answer is actually yes.
Recall = TP / (TP + FN) = 100 / (100 + 5) = 0.95
F1-score
F1-score = (2 * Recall * Precision) / (Recall + Precision) = (2 * 0.95 * 0.91) / (0.91 + 0.95) ≈ 0.93
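As a cross-check, the same numbers can be reproduced with scikit-learn (a sketch, assuming scikit-learn is installed; the label lists are synthetic and constructed only to match the exercise counts TP = 100, FN = 5, FP = 10, TN = 50):

```python
from sklearn.metrics import (confusion_matrix, accuracy_score,
                             precision_score, recall_score, f1_score)

# Synthetic labels that reproduce the exercise counts: 100 TP, 5 FN, 50 TN, 10 FP.
y_true = [1] * 105 + [0] * 60
y_pred = [1] * 100 + [0] * 5 + [0] * 50 + [1] * 10

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(tp, tn, fp, fn)                   # 100 50 10 5
print(accuracy_score(y_true, y_pred))   # 0.909...
print(precision_score(y_true, y_pred))  # 0.909...
print(recall_score(y_true, y_pred))     # 0.952...
print(f1_score(y_true, y_pred))         # 0.930...
```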
Got any questions?
Email: amarmandal2153@gmail.com
Thank youuuu…