Implementing Dropout Technique
TensorFlow and Keras provide the tools to implement a neural network that uses the dropout technique, simply by including dropout layers within the network architecture.
We only need to add one line to include a dropout layer within a more extensive neural network architecture. The Dropout class takes a few arguments, but for now, we are only concerned with the ‘rate’ argument. The dropout rate is a hyperparameter that represents the likelihood of a neuron activation being set to zero during a training step. The rate argument can take values between 0 and 1.
keras.layers.Dropout(rate=0.2)
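To get a feel for what the rate argument does, here is a small, self-contained sketch (my own illustration, not part of the article's network): during training, a Dropout layer with rate=0.2 zeroes roughly 20% of the incoming activations and scales the survivors by 1/(1 - rate), while at inference time it passes activations through unchanged.

import tensorflow as tf
from tensorflow import keras

dropout = keras.layers.Dropout(rate=0.2)
data = tf.ones((1, 10))  # a batch with ten activations, all equal to 1.0

# training=True: roughly 20% of activations become zero, the rest are scaled by 1/0.8
print(dropout(data, training=True))

# training=False (inference): the activations pass through unchanged
print(dropout(data, training=False))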
From this point onwards, we will walk through the steps required to implement, train, and evaluate a neural network.
1. Load the required libraries, TensorFlow and Keras.
import tensorflow as tf
from tensorflow import keras
2. Load the Fashion-MNIST dataset, normalize the images, and partition the dataset into training, validation, and test data.
(train_images, train_labels), (test_images, test_labels) = keras.datasets.fashion_mnist.load_data()

# Scale pixel values from the range [0, 255] down to [0, 1]
train_images = train_images / 255.0
test_images = test_images / 255.0

# Hold out the first 5,000 training examples as a validation set
validation_images = train_images[:5000]
validation_labels = train_labels[:5000]
train_images = train_images[5000:]
train_labels = train_labels[5000:]
3. Create a custom model that includes a dropout layer using the Keras Model Class API.
class CustomModel(keras.Model):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.input_layer = keras.layers.Flatten(input_shape=(28, 28))
        self.hidden1 = keras.layers.Dense(200, activation='relu')
        self.hidden2 = keras.layers.Dense(100, activation='relu')
        self.hidden3 = keras.layers.Dense(60, activation='relu')
        self.output_layer = keras.layers.Dense(10, activation='softmax')
        self.dropout_layer = keras.layers.Dropout(rate=0.2)

    def call(self, inputs):
        # Flatten the 28x28 images and apply dropout after the input and each hidden layer
        x = self.input_layer(inputs)
        x = self.dropout_layer(x)
        x = self.hidden1(x)
        x = self.dropout_layer(x)
        x = self.hidden2(x)
        x = self.dropout_layer(x)
        x = self.hidden3(x)
        x = self.dropout_layer(x)
        return self.output_layer(x)
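For comparison, the same architecture can also be sketched with the Keras Sequential API; this is an assumed equivalent of the subclassed model above, with a Dropout layer written out explicitly after the flattening step and after each hidden Dense layer.

sequential_model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dropout(rate=0.2),
    keras.layers.Dense(200, activation='relu'),
    keras.layers.Dropout(rate=0.2),
    keras.layers.Dense(100, activation='relu'),
    keras.layers.Dropout(rate=0.2),
    keras.layers.Dense(60, activation='relu'),
    keras.layers.Dropout(rate=0.2),
    keras.layers.Dense(10, activation='softmax')
])

Either version can be compiled and trained exactly as shown in the following steps.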
4. Instantiate the implemented model and initialize the optimizer and hyperparameters.
model = CustomModel()
# Stochastic gradient descent optimizer with a learning rate of 0.01
sgd = keras.optimizers.SGD(learning_rate=0.01)
model.compile(loss="sparse_categorical_crossentropy", optimizer=sgd, metrics=["accuracy"])
5. Train the model for a total of 60 epochs.
model.fit(train_images, train_labels, epochs=60, validation_data=(validation_images, validation_labels))
6. Evaluate the model on the test dataset.
model.evaluate(test_images, test_labels)
The result of the evaluation will look similar to the example below:
10000/10000 [==============================] - 0s 34us/sample - loss: 0.3230 - accuracy: 0.8812
[0.32301584649085996, 0.8812]
The accuracy shown in the evaluation result corresponds to a model accuracy of roughly 88% on the test dataset.
With some fine-tuning and training for more epochs, the accuracy could be increased by a few percentage points.
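As one hedged example of such fine-tuning (my own sketch, not code from the article), the epoch budget could be raised well beyond 60 while an early-stopping callback monitors the validation loss and restores the best weights, so the longer run does not simply overfit:

# Stop training once validation loss has not improved for 10 epochs,
# and roll back to the weights from the best epoch
early_stopping = keras.callbacks.EarlyStopping(monitor="val_loss", patience=10,
                                               restore_best_weights=True)

tuned_model = CustomModel()
tuned_model.compile(loss="sparse_categorical_crossentropy",
                    optimizer=keras.optimizers.SGD(learning_rate=0.01),
                    metrics=["accuracy"])
tuned_model.fit(train_images, train_labels, epochs=200,
                validation_data=(validation_images, validation_labels),
                callbacks=[early_stopping])
tuned_model.evaluate(test_images, test_labels)

The dropout rate itself is another hyperparameter worth varying (for example, values between 0.1 and 0.5) while watching validation accuracy.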
Here’s a GitHub repository for the code presented in this article.