Note that for small datasets with 10 classes, e.g., CIFAR10, we can find WordNet hypotheses for all nodes. However, for large datasets with 1000 classes, e.g., ImageNet, we can find WordNet hypotheses for only a subset of nodes.
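For intuition, here is a minimal sketch of how such a hypothesis can be looked up with NLTK's WordNet interface: label an internal node with the lowest common hypernym of its leaf classes. The synset names below are illustrative, and this is a sketch rather than the nbdt library's API.

# Minimal sketch: hypothesize a label for an internal node grouping
# "cat" and "dog" by taking their lowest common hypernym in WordNet.
from nltk.corpus import wordnet as wn  # requires nltk.download('wordnet')

cat = wn.synset('cat.n.01')
dog = wn.synset('dog.n.01')
print(cat.lowest_common_hypernyms(dog)[0].name())  # 'carnivore.n.01'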
Trying NBDTs in under a minute
Interested in trying out an NBDT now? Without installing anything, you can view more example outputs online and even try out our web demo. Alternatively, use our command-line utility to run inference (install with pip install nbdt). Below, we run inference on a picture of a cat.
nbdt 'https://images.pexels.com/photos/126407/pexels-photo-126407.jpeg?auto=compress&cs=tinysrgb&dpr=2&w=32'  # this can also be a path to a local image
This outputs both the class prediction and all the intermediate decisions.
Prediction: cat // Decisions: animal (99.47%), chordate (99.20%), carnivore (99.42%), cat (99.86%)
You can load a pretrained NBDT in just a few lines of Python as well. Use the following to get started. We support several neural networks and datasets.
from nbdt.model import HardNBDT
from nbdt.models import wrn28_10_cifar10

model = wrn28_10_cifar10()
model = HardNBDT(
    pretrained=True,
    dataset='CIFAR10',
    arch='wrn28_10_cifar10',
    model=model)
For reference, see the script for the command-line tool we ran above; only ~20 lines are directly involved in transforming the input and running inference. For more instructions on getting started and examples, see our GitHub repository.
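To sketch the core of that inference step, the snippet below assumes standard CIFAR-10 preprocessing (the nbdt script's exact transform may differ) and that the NBDT's forward pass returns class logits like an ordinary classifier; 'cat.jpg' is a placeholder for any local image.

from PIL import Image
import torchvision.transforms as transforms

# Common CIFAR-10 normalization constants (an assumption; the nbdt
# script's exact preprocessing may differ).
transform = transforms.Compose([
    transforms.Resize(32),
    transforms.CenterCrop(32),
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465),
                         (0.2023, 0.1994, 0.2010)),
])
x = transform(Image.open('cat.jpg'))[None]  # add a batch dimension
outputs = model(x)                          # model from the snippet above
_, predicted = outputs.max(1)
print(predicted)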
How it Works
The training and inference process for a Neural-Backed Decision Tree can be broken down into four steps.
- Construct a hierarchy for the decision tree. This hierarchy determines which sets of classes the NBDT must decide between. We refer to this hierarchy as an Induced Hierarchy.
- This hierarchy yields a particular loss function that we call the Tree Supervision Loss⁵. Train the original neural network, without any modifications, using this new loss (a simplified sketch follows this list).
- Start inference by passing the sample through the neural network backbone. The backbone consists of all neural network layers before the final fully-connected layer.
- Finish inference by running the final fully-connected layer as a sequence of decision rules, which we call Embedded Decision Rules (sketched below). These decisions culminate in the final prediction.
For more detail, see our paper (Sec 3).
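To make step 2 concrete, here is a simplified sketch of a tree-supervision-style loss. It makes our own simplifying assumption that an internal node's logit for each child is the mean of that child's leaf logits; the paper (Sec 3) gives the exact formulation.

import torch
import torch.nn.functional as F

def tree_supervision_loss(logits, targets, nodes):
    # logits: (batch, num_classes); nodes: list of internal nodes, each a
    # list of children, each child a list of leaf class indices.
    loss = F.cross_entropy(logits, targets)  # standard cross-entropy term
    for children in nodes:
        # Child logit = mean of that child's leaf logits (our assumption).
        child_logits = torch.stack(
            [logits[:, leaves].mean(dim=1) for leaves in children], dim=1)
        # Target child = the child whose leaves contain the true class.
        child_targets = torch.zeros_like(targets)
        for i, leaves in enumerate(children):
            for c in leaves:
                child_targets[targets == c] = i
        loss = loss + F.cross_entropy(child_logits, child_targets)
    return loss

# Example: 4 classes, a single root node splitting {0, 1} vs. {2, 3}.
logits, targets = torch.randn(8, 4), torch.randint(0, 4, (8,))
loss = tree_supervision_loss(logits, targets, nodes=[[[0, 1], [2, 3]]])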
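Steps 3 and 4 can likewise be sketched in a few lines: each leaf's representative vector is the corresponding row of the final fully-connected layer, each internal node's representative is the average of its leaves' rows, and hard inference repeatedly descends to the child whose representative has the largest inner product with the featurized sample. The toy hierarchy and helpers below are ours, not the library's.

import torch

num_classes, feat_dim = 4, 16
fc_weight = torch.randn(num_classes, feat_dim)  # stands in for model.fc.weight

def node(leaves, children=()):
    return {'leaves': leaves, 'children': list(children)}

# Toy hierarchy: root splits {0, 1} vs. {2, 3}, then into single leaves.
tree = node([0, 1, 2, 3], [
    node([0, 1], [node([0]), node([1])]),
    node([2, 3], [node([2]), node([3])]),
])

def representative(n):
    # Average of the fully-connected rows for the node's leaf classes.
    return fc_weight[n['leaves']].mean(dim=0)

def hard_inference(x, n=tree):
    # At each node, follow the child maximizing <representative, x>.
    while n['children']:
        scores = [float(representative(c) @ x) for c in n['children']]
        n = n['children'][scores.index(max(scores))]
    return n['leaves'][0]  # the surviving leaf is the predicted class

x = torch.randn(feat_dim)  # stands in for the backbone's output features
print(hard_inference(x))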
Conclusion
Explainable AI does not fully explain how the neural network reaches a prediction: Existing methods explain the image’s impact on model predictions but do not explain the decision process. Decision trees address this, but unfortunately, images⁷ are kryptonite for decision tree accuracy.
We thus combine neural networks and decision trees. Unlike predecessors that arrived at the same hybrid design, our neural-backed decision trees (NBDTs) simultaneously address the failures (1) of neural networks to provide justification and (2) of decision trees to attain high accuracy. This primes a new category of accurate, interpretable NBDTs for applications like medicine and finance. To get started, see the project page.
By Alvin Wan, *Lisa Dunlap, *Daniel Ho, Jihan Yin, Scott Lee, Henry Jin, Suzanne Petryk, Sarah Adel Bargal, Joseph E. Gonzalez
where * denotes equal contribution