TensorFlow2_200729 Series --- 22. CIFAR-10 Classification in Practice

I. Summary

One-sentence summary:

The CIFAR-10 and CIFAR-100 datasets are labeled subsets of the 80 million tiny images dataset. Website: http://www.cs.toronto.edu/~kriz/cifar.html

 

 

1. Where is the CIFAR-10 data stored after downloading?

Since the Keras datasets utility is used, the downloaded files end up in the Keras cache folder: the C:\Users\xxx\.keras\datasets directory (a minimal check is sketched below).
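For reference, a minimal sketch (assuming a standard TensorFlow/Keras install) that loads the data and lists the cache directory; the exact archive/folder names may vary by Keras version:

import os
from tensorflow.keras import datasets

# The first call downloads CIFAR-10 and caches it under ~/.keras/datasets
# (C:\Users\<user>\.keras\datasets on Windows)
(x, y), (x_val, y_val) = datasets.cifar10.load_data()

cache_dir = os.path.join(os.path.expanduser('~'), '.keras', 'datasets')
print(os.listdir(cache_dir))  # typically includes 'cifar-10-batches-py'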

 

 

2. Common ways to optimize the network (in this example)?

1. Normalize the input to [-1, 1]: x = 2 * tf.cast(x, dtype=tf.float32) / 255. - 1.
2. Add more nodes per layer: from 32*32*3->256->128->64->32->10 to 32*32*3->256->256->256->256->10 (see the sketch after this list)
3. Add more layers
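As a rough illustration of points 2 and 3, here is a minimal sketch of the wider fully-connected network listed above (32*32*3 -> 256 -> 256 -> 256 -> 256 -> 10). Only the layer widths come from the summary; the ReLU activations and the flattened 3072-dimensional input are assumptions.

import tensorflow as tf
from tensorflow.keras import Sequential, layers

# Fully-connected network: 32*32*3 -> 256 -> 256 -> 256 -> 256 -> 10
model = Sequential([
    layers.Dense(256, activation=tf.nn.relu),
    layers.Dense(256, activation=tf.nn.relu),
    layers.Dense(256, activation=tf.nn.relu),
    layers.Dense(256, activation=tf.nn.relu),
    layers.Dense(10),  # logits for the 10 CIFAR-10 classes
])
model.build(input_shape=[None, 32 * 32 * 3])  # images flattened to 3072-dim vectors
model.summary()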

 

 

II. CIFAR-10 Classification in Practice

Video location for the corresponding course lesson:

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'  # suppress TensorFlow INFO/WARNING log output

import tensorflow as tf
from tensorflow.keras import datasets, layers, optimizers, Sequential, metrics
from tensorflow import keras



# Normalization
def preprocess(x, y):
    # Common optimization 1: [0, 255] => [-1, 1]
    # the [-1, 1] range works well for neural networks
    x = 2 * tf.cast(x, dtype=tf.float32) / 255. - 1.
    y = tf.cast(y, dtype=tf.int32)
    return x, y


batchsz = 128
# x: [50k, 32, 32, 3], y: [50k, 1]; x_val: [10k, 32, 32, 3], y_val: [10k, 1]
(x, y), (x_val, y_val) = datasets.cifar10.load_data()
y = tf.squeeze(y)            # [50k, 1] => [50k]
y_val = tf.squeeze(y_val)    # [10k, 1] => [10k]
y = tf.one_hot(y, depth=10)          # [50k] => [50k, 10]
y_val = tf.one_hot(y_val, depth=10)  # [10k] => [10k, 10]
print('datasets:', x.shape, y.shape, x_val.shape, y_val.shape, x.min(), x.max())


train_db = tf.data.Dataset.from_tensor_slices((x,y))
train_db = train_db.map(preprocess).shuffle(10000).batch(batchsz)
test_db = tf.data.Dataset.from_tensor_slices((x_val, y_val))
test_db = test_db.map(preprocess).batch(batchsz)


sample = next(iter(train_db))
print('batch:', sample[0].shape, sample[1].shape)  # e.g. (128, 32, 32, 3) (128, 10)