Note that in small datasets with 10 classes, e.g., CIFAR10, we can find WordNet hypotheses for all nodes. However, in large datasets with 1000 classes, e.g., ImageNet, we can find WordNet hypotheses for only a subset of nodes.
Trying NBDTs in under a minute
Interested in trying out an NBDT now? Without installing anything, you can view more example outputs online and even try out our web demo. Alternatively, use our command-line utility to run inference (install with pip install nbdt). Below, we run inference on a picture of a cat.
nbdt "https://images.pexels.com/photos/126407/pexels-photo-126407.jpeg?auto=compress&cs=tinysrgb&dpr=2&w=32"  # the URL must be quoted because it contains "&"; this can also be a path to a local image
This outputs both the class prediction and all the intermediate decisions.
Prediction: cat
Decisions: animal (99.47%), chordate (99.20%), carnivore (99.42%), cat (99.86%)
You can load a pretrained NBDT in just a few lines of Python as well. Use the following to get started. We support several neural networks and datasets.
from nbdt.model import HardNBDT
from nbdt.models import wrn28_10_cifar10

model = wrn28_10_cifar10()
model = HardNBDT(
    pretrained=True,
    dataset='CIFAR10',
    arch='wrn28_10_cifar10',
    model=model)
For reference, see the script for the command-line tool we ran above; only ~20 lines are directly involved in transforming the input and running inference. For more instructions on getting started and examples, see our Github repository.
How it Works
The training and inference process for a Neural-Backed Decision Tree can be broken down into four steps.
- Construct a hierarchy for the decision tree. This hierarchy determines which sets of classes the NBDT must decide between. We refer to this hierarchy as an Induced Hierarchy.
- This hierarchy yields a particular loss function, which we call the Tree Supervision Loss⁵. Train the original neural network, without any modifications, using this new loss.
- Start inference by passing the sample through the neural network backbone. The backbone is all neural network layers before the final fully-connected layer.
- Finish inference by running the final fully-connected layer as a sequence of decision rules, which we call Embedded Decision Rules. These decisions culminate in the final prediction.
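The embedded decision rules in steps 3-4 can be sketched in a few lines. The following is a minimal illustration, not the nbdt library's API: the hierarchy, weight matrix, and function names (`representative`, `nbdt_infer`) are toy assumptions. Each leaf's representative is its row of the final fully-connected layer's weight matrix; each inner node's representative is the average of its leaves' rows; inference walks the tree, at each node picking the child whose representative has the largest inner product with the sample's features.

```python
import numpy as np

# Toy final fully-connected weights: one row per class (3-dim features).
W = np.array([
    [1.0, 0.0, 0.0],   # cat
    [0.9, 0.1, 0.0],   # dog
    [0.0, 1.0, 0.2],   # car
    [0.0, 0.9, 0.3],   # plane
])
classes = ["cat", "dog", "car", "plane"]

# Toy hierarchy: inner nodes map to children; leaves are class indices.
tree = {
    "root":    ["animal", "vehicle"],
    "animal":  [0, 1],
    "vehicle": [2, 3],
}

def representative(node):
    """Leaf: its row of W. Inner node: average of its leaves' rows."""
    if isinstance(node, int):
        return W[node]
    return np.mean([representative(c) for c in tree[node]], axis=0)

def nbdt_infer(x):
    """Hard inference: greedily follow the child with the largest inner product."""
    node, path = "root", []
    while not isinstance(node, int):
        children = tree[node]
        scores = [representative(c) @ x for c in children]
        node = children[int(np.argmax(scores))]
        path.append(node if isinstance(node, str) else classes[node])
    return classes[node], path

x = np.array([0.8, 0.1, 0.0])   # featurized sample from the backbone
pred, path = nbdt_infer(x)      # pred == "cat", path == ["animal", "cat"]
```

This is the "hard" variant, which commits to one branch per node; the soft variant instead multiplies child probabilities down every path.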
For more detail, see our paper (Sec. 3).
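The tree supervision loss in step 2 can likewise be sketched under the same toy setup (again illustrative names and shapes, not the paper's exact formulation): at every inner node on the path to the true leaf, apply a cross-entropy loss over that node's children, scoring each child by the inner product between the sample's features and the child's representative.

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # numerically stable softmax
    e = np.exp(z)
    return e / e.sum()

# Toy fully-connected weights over 4 classes with 2-dim features; the
# hierarchy groups classes {0, 1} and {2, 3} under the root.
W = np.array([
    [1.0, 0.0],   # class 0
    [0.8, 0.2],   # class 1
    [0.0, 1.0],   # class 2
    [0.2, 0.8],   # class 3
])
groups = [[0, 1], [2, 3]]
group_reps = np.stack([W[g].mean(axis=0) for g in groups])  # inner-node reps

def tree_supervision_loss(x, label):
    """Sum of per-node cross-entropies along the path to `label`."""
    g = 0 if label in groups[0] else 1
    # Decision at the root: which group contains the true class?
    loss = -np.log(softmax(group_reps @ x)[g])
    # Decision inside the group: which leaf is the true class?
    loss += -np.log(softmax(W[groups[g]] @ x)[groups[g].index(label)])
    return loss

x = np.array([0.9, 0.1])
loss = tree_supervision_loss(x, label=0)
```

Minimizing this loss encourages the backbone's features to be separable at every node of the hierarchy, not just at the leaves; e.g., here the loss for label 0 is smaller than for label 2, since x aligns with group 0's representative.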
Conclusion
Explainable AI does not fully explain how the neural network reaches a prediction: Existing methods explain the image’s impact on model predictions but do not explain the decision process. Decision trees address this, but unfortunately, images⁷ are kryptonite for decision tree accuracy.
We thus combine neural networks and decision trees. Unlike predecessors that arrived at the same hybrid design, our neural-backed decision trees (NBDTs) simultaneously address the failures (1) of neural networks to provide justification and (2) of decision trees to attain high accuracy. This primes a new category of accurate, interpretable NBDTs for applications like medicine and finance. To get started, see the project page .
By Alvin Wan*, Lisa Dunlap*, Daniel Ho, Jihan Yin, Scott Lee, Henry Jin, Suzanne Petryk, Sarah Adel Bargal, Joseph E. Gonzalez
where * denotes equal contribution