Below we use TensorFlow to implement the multilayer perceptron from the previous section. First, import the required packages and modules.
```python
import tensorflow as tf
from tensorflow import keras

fashion_mnist = keras.datasets.fashion_mnist
```
The only difference from softmax regression is that we add one extra fully connected layer as a hidden layer. It has 256 hidden units and uses the ReLU function as its activation function.
```python
model = tf.keras.models.Sequential([
    # Flatten each 28x28 input image into a vector of 784 pixels.
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    # Hidden layer: 256 units with ReLU activation.
    tf.keras.layers.Dense(256, activation='relu'),
    # Output layer: one unit per Fashion-MNIST class.
    tf.keras.layers.Dense(10, activation='softmax')
])
```
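To double-check the architecture, we can print a layer-by-layer summary. This is an optional step not in the original text; `model.summary()` is the standard Keras call for it.

```python
# Optional check: inspect output shapes and parameter counts.
# Hidden layer: 784*256 + 256 = 200,960 parameters;
# output layer: 256*10 + 10 = 2,570; 203,530 in total.
model.summary()
```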
We read the data and train the model with almost the same steps as those used to train softmax regression in Section 3.7.
```python
# Load Fashion-MNIST and scale pixel values from [0, 255] to [0, 1].
(x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()
x_train = x_train / 255.0
x_test = x_test / 255.0
```
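As an optional sanity check, not part of the original script, we can verify the array shapes and the scaled value range before training:

```python
# Fashion-MNIST ships 60,000 training and 10,000 test images of size 28x28.
print(x_train.shape, y_train.shape)  # (60000, 28, 28) (60000,)
print(x_test.shape, y_test.shape)    # (10000, 28, 28) (10000,)
print(x_train.min(), x_train.max())  # 0.0 1.0 after dividing by 255
```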
```python
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.5),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5,
          batch_size=256,
          validation_data=(x_test, y_test),
          validation_freq=1)
```
Output:
```
Train on 60000 samples, validate on 10000 samples
Epoch 1/5
60000/60000 [==============================] - 2s 33us/sample - loss: 0.7428 - accuracy: 0.7333 - val_loss: 0.5489 - val_accuracy: 0.8049
Epoch 2/5
60000/60000 [==============================] - 1s 22us/sample - loss: 0.4774 - accuracy: 0.8247 - val_loss: 0.4823 - val_accuracy: 0.8288
Epoch 3/5
60000/60000 [==============================] - 1s 21us/sample - loss: 0.4111 - accuracy: 0.8497 - val_loss: 0.4448 - val_accuracy: 0.8401
Epoch 4/5
60000/60000 [==============================] - 1s 21us/sample - loss: 0.3806 - accuracy: 0.8600 - val_loss: 0.5326 - val_accuracy: 0.8132
Epoch 5/5
60000/60000 [==============================] - 1s 21us/sample - loss: 0.3603 - accuracy: 0.8681 - val_loss: 0.4217 - val_accuracy: 0.8448
<tensorflow.python.keras.callbacks.History at 0x7f9868e12310>
```
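After training, we can evaluate the model on the test set directly and predict the class of a single image. The following is a minimal sketch using standard Keras calls; it is not part of the original section:

```python
import numpy as np

# Evaluate the trained model on the held-out test set.
test_loss, test_acc = model.evaluate(x_test, y_test, verbose=0)
print('test accuracy:', test_acc)

# Predict class probabilities for the first test image and take the argmax.
probs = model.predict(x_test[:1])
print('predicted class:', np.argmax(probs, axis=-1)[0], 'true class:', y_test[0])
```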
- With TensorFlow 2.0, a multilayer perceptron can be implemented much more concisely.
Note: Apart from the code, this section is essentially the same as in the original book; link to the original.