Note that in small datasets with 10 classes, e.g., CIFAR10, we can find WordNet hypotheses for all nodes. However, in large datasets with 1000 classes, e.g., ImageNet, we can find WordNet hypotheses for only a subset of nodes.
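As a concrete illustration of where such hypotheses come from, the sketch below uses NLTK's WordNet interface to find the lowest common hypernym of two CIFAR10 classes. This particular class pair and the use of NLTK here are our choices for illustration, not necessarily the exact tooling used in the project.

from nltk.corpus import wordnet as wn  # pip install nltk; then nltk.download('wordnet')

# Two CIFAR10 classes that might share an intermediate node.
cat = wn.synset('cat.n.01')
dog = wn.synset('dog.n.01')

# The lowest common hypernym is a natural WordNet hypothesis for their parent node.
print(cat.lowest_common_hypernyms(dog))  # expect [Synset('carnivore.n.01')]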
Trying NBDTs in under a minute
Interested in trying out an NBDT now? Without installing anything, you can view more example outputs online and even try out our web demo. Alternatively, use our command-line utility to run inference (install with pip install nbdt). Below, we run inference on a picture of a cat.
nbdt "https://images.pexels.com/photos/126407/pexels-photo-126407.jpeg?auto=compress&cs=tinysrgb&dpr=2&w=32"  # this can also be a path to a local image
This outputs both the class prediction and all the intermediate decisions.
Prediction: cat // Decisions: animal (99.47%), chordate (99.20%), carnivore (99.42%), cat (99.86%)
You can load a pretrained NBDT in just a few lines of Python as well. Use the following to get started. We support several neural networks and datasets.
from nbdt.model import HardNBDT
from nbdt.models import wrn28_10_cifar10

model = wrn28_10_cifar10()
model = HardNBDT(
    pretrained=True,
    dataset='CIFAR10',
    arch='wrn28_10_cifar10',
    model=model)
For reference, see the script for the command-line tool we ran above; only ~20 lines are directly involved in transforming the input and running inference. For more instructions on getting started and examples, see our GitHub repository.
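To make the snippet above end-to-end, here is a minimal sketch of that remaining glue: load an image, apply standard CIFAR10 preprocessing, and run inference with the model constructed above. The image-loading code and normalization constants below are our assumptions, not the CLI script verbatim.

# A minimal sketch of inference with the `model` loaded above. The image
# loading and the CIFAR10 normalization stats are our assumptions.
import torch
import requests
from io import BytesIO
from PIL import Image
from torchvision import transforms

url = 'https://images.pexels.com/photos/126407/pexels-photo-126407.jpeg?auto=compress&cs=tinysrgb&dpr=2&w=32'
im = Image.open(BytesIO(requests.get(url).content)).convert('RGB')

transform = transforms.Compose([
    transforms.Resize(32),
    transforms.CenterCrop(32),
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465),   # standard CIFAR10 stats
                         (0.2023, 0.1994, 0.2010)),
])
x = transform(im)[None]  # add a batch dimension

model.eval()
with torch.no_grad():
    outputs = model(x)   # NBDT forward pass: backbone + embedded decision rules

classes = ('airplane', 'automobile', 'bird', 'cat', 'deer',
           'dog', 'frog', 'horse', 'ship', 'truck')
print('Prediction:', classes[outputs[0].argmax()])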
How it Works
The training and inference process for a Neural-Backed Decision Tree can be broken down into four steps.
- Construct a hierarchy for the decision tree. This hierarchy determines which sets of classes the NBDT must decide between. We refer to this hierarchy as an Induced Hierarchy.
- This hierarchy yields a particular loss function that we call the Tree Supervision Loss⁵. Train the original neural network, without any modifications, using this new loss. (A sketch of one possible form of this loss appears after this list.)
- Start inference by passing the sample through the neural network backbone. The backbone is all neural network layers before the final fully-connected layer.
- Finish inference by running the final fully-connected layer as a sequence of decision rules, which we call Embedded Decision Rules. These decisions culminate in the final prediction. (These rules are sketched right after this list.)
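To make the decision rules concrete, here is a minimal sketch under our own simplifying assumptions: each class (leaf) representative is the corresponding row of the final fully-connected layer's weight matrix, each inner node's representative is the average of its leaves' representatives, and a hard rule at each node picks the child with the largest inner product. The two-level, four-class hierarchy below is hypothetical.

# A sketch of hard embedded decision rules, under simplifying assumptions.
# W stands in for the final fully-connected layer's weights (fc.weight);
# row k is the representative for class k.
import torch

feature_dim, num_classes = 8, 4
W = torch.randn(num_classes, feature_dim)   # stands in for fc.weight
x = torch.randn(feature_dim)                # featurized sample from the backbone

# Hypothetical induced hierarchy over 4 classes:
# the root decides {0, 1} vs. {2, 3}.
children = [[0, 1], [2, 3]]

def rep(leaves):
    """Inner-node representative: mean of its leaves' rows of W."""
    return W[leaves].mean(0)

# Hard rule at the root: pick the child whose representative has the
# largest inner product with the sample.
scores = torch.stack([rep(c) for c in children]) @ x
chosen = children[int(scores.argmax())]

# Final decision: between the individual leaves of the chosen child.
scores = W[chosen] @ x
print('Predicted class:', chosen[int(scores.argmax())])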
For more detail, see our paper (Sec 3).
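Similarly, here is a hedged sketch of the Tree Supervision Loss from step 2: alongside the usual cross entropy over classes, add a cross entropy at each inner node over its children, where a child's logit comes from its representative and the target is the child containing the true class. The single-node hierarchy and the weight of 1.0 below are our assumptions; see the paper for the exact formulation.

# A sketch of a tree supervision term, under our own assumptions.
import torch
import torch.nn.functional as F

feature_dim, num_classes = 8, 4
W = torch.randn(num_classes, feature_dim, requires_grad=True)  # fc.weight
x = torch.randn(feature_dim)   # featurized sample
y = torch.tensor(1)            # true class

# Same hypothetical hierarchy as above: the root decides {0, 1} vs. {2, 3}.
children = [[0, 1], [2, 3]]

# Standard cross entropy over all classes.
loss = F.cross_entropy((W @ x)[None], y[None])

# Tree supervision term at the root: cross entropy over child logits,
# where each child's logit uses the mean of its leaves' representatives
# and the target is the child containing the true class.
child_logits = torch.stack([W[c].mean(0) @ x for c in children])
target = torch.tensor(0 if int(y) in children[0] else 1)
loss = loss + 1.0 * F.cross_entropy(child_logits[None], target[None])  # weight 1.0 is our assumption

print(float(loss))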
Conclusion
Explainable AI does not fully explain how the neural network reaches a prediction: existing methods explain the image's impact on model predictions but do not explain the decision process. Decision trees address this, but unfortunately, images⁷ are kryptonite for decision tree accuracy.
We thus combine neural networks and decision trees. Unlike predecessors that arrived at the same hybrid design, our neural-backed decision trees (NBDTs) simultaneously address the failures (1) of neural networks to provide justification and (2) of decision trees to attain high accuracy. This primes a new category of accurate, interpretable NBDTs for applications like medicine and finance. To get started, see the project page.
By Alvin Wan, *Lisa Dunlap, *Daniel Ho, Jihan Yin, Scott Lee, Henry Jin, Suzanne Petryk, Sarah Adel Bargal, Joseph E. Gonzalez
where * denotes equal contribution