Neural Networks and tf.keras
1.3 Implementing a Neural Network with TensorFlow
Learning objectives
- Goal
  - Master the use of the TensorFlow API
- Application
  - Use TF to build a classification model
1.3.1 Introduction to TensorFlow Keras
Keras is a high-level API for building and training deep learning models. It can be used for rapid prototyping, advanced research, and production, and it has three main advantages:
- User friendly: Keras has a simple, consistent interface optimized for common use cases, and it gives clear, actionable feedback on user errors, so models can be built quickly.
- Modular and composable: Keras models are built by connecting configurable building blocks together, with few restrictions.
- Easy to extend: custom building blocks can be written to express new ideas for research.

Import:

```python
import tensorflow as tf
from tensorflow import keras
```
- 1. Get an existing dataset (no need to build one yourself): keras.datasets
  - mnist: handwritten digits
  - fashion_mnist: fashion-article classification
  - cifar10 / cifar100: 10-class / 100-class image classification

```python
fashion_mnist = keras.datasets.fashion_mnist
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()
print(train_images, train_labels)
```
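As a quick check of what load_data() returns (a minimal sketch assuming the snippet above has already run), Fashion-MNIST provides 60,000 training images and 10,000 test images of size 28x28:

```python
print(train_images.shape, train_images.dtype)   # (60000, 28, 28) uint8
print(test_images.shape)                        # (10000, 28, 28)
print(train_labels[:10])                        # integer class ids in [0, 9]
```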
- 2. Build the model
  - In Keras, you build a model by composing layers. A model is (usually) a graph of layers, and the most common type of model is a stack of layers. keras.layers provides many layer classes; some of them (as found in the source files) are listed below.
  - tf.keras.Sequential model (available layers include):

```python
from tensorflow.python.keras.layers import Dense
from tensorflow.python.keras.layers import DepthwiseConv2D
from tensorflow.python.keras.layers import Dot
from tensorflow.python.keras.layers import Dropout
from tensorflow.python.keras.layers import ELU
from tensorflow.python.keras.layers import Embedding
from tensorflow.python.keras.layers import Flatten
from tensorflow.python.keras.layers import GRU
from tensorflow.python.keras.layers import GRUCell
from tensorflow.python.keras.layers import LSTMCell
...
```
- Flatten: reshapes the input by flattening it
- Dense: adds a fully connected layer of neurons
  - Dense(units, activation=None, **kwargs)
    - units: number of neurons
    - activation: activation function, e.g. tf.nn.relu, tf.nn.softmax, tf.nn.sigmoid, tf.nn.tanh
    - **kwargs: e.g. the shape of the input from the previous layer, input_shape=()
- tf.keras.Sequential builds a pipeline-like model (an equivalent add()-based form is sketched after the example):

```python
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax)
])
```
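Equivalently, layers can be appended one at a time with model.add(); a minimal sketch of the same two-layer classifier built incrementally:

```python
# Same architecture as above, built incrementally with add().
model = keras.Sequential()
model.add(keras.layers.Flatten(input_shape=(28, 28)))
model.add(keras.layers.Dense(128, activation=tf.nn.relu))
model.add(keras.layers.Dense(10, activation=tf.nn.softmax))
```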
- 3. Train and evaluate
  - Call the model's compile method to configure the training parameters and evaluation metrics the model needs.
  - model.compile(optimizer, loss=None, metrics=None): configures training-related parameters
    - optimizer: gradient descent optimizer (from keras.optimizers), for example:

```python
from tensorflow.python.keras.optimizers import Adadelta
from tensorflow.python.keras.optimizers import Adagrad
from tensorflow.python.keras.optimizers import Adam
from tensorflow.python.keras.optimizers import Adamax
from tensorflow.python.keras.optimizers import Nadam
from tensorflow.python.keras.optimizers import Optimizer
from tensorflow.python.keras.optimizers import RMSprop
from tensorflow.python.keras.optimizers import SGD
from tensorflow.python.keras.optimizers import deserialize
from tensorflow.python.keras.optimizers import get
from tensorflow.python.keras.optimizers import serialize
```

    - loss=None: the loss; it can be given either as a string or as the loss function itself. For reference:
```python
from tensorflow.python.keras.losses import KLD
from tensorflow.python.keras.losses import KLD as kld
from tensorflow.python.keras.losses import KLD as kullback_leibler_divergence
from tensorflow.python.keras.losses import MAE
from tensorflow.python.keras.losses import MAE as mae
from tensorflow.python.keras.losses import MAE as mean_absolute_error
from tensorflow.python.keras.losses import MAPE
from tensorflow.python.keras.losses import MAPE as mape
from tensorflow.python.keras.losses import MAPE as mean_absolute_percentage_error
from tensorflow.python.keras.losses import MSE
from tensorflow.python.keras.losses import MSE as mean_squared_error
from tensorflow.python.keras.losses import MSE as mse
from tensorflow.python.keras.losses import MSLE
from tensorflow.python.keras.losses import MSLE as mean_squared_logarithmic_error
from tensorflow.python.keras.losses import MSLE as msle
from tensorflow.python.keras.losses import binary_crossentropy
from tensorflow.python.keras.losses import categorical_crossentropy
from tensorflow.python.keras.losses import categorical_hinge
from tensorflow.python.keras.losses import cosine
from tensorflow.python.keras.losses import cosine as cosine_proximity
from tensorflow.python.keras.losses import deserialize
from tensorflow.python.keras.losses import get
from tensorflow.python.keras.losses import hinge
from tensorflow.python.keras.losses import logcosh
from tensorflow.python.keras.losses import poisson
from tensorflow.python.keras.losses import serialize
from tensorflow.python.keras.losses import sparse_categorical_crossentropy
from tensorflow.python.keras.losses import squared_hinge
```

    - metrics=None: list of evaluation metrics, e.g. ['accuracy']
  - model.fit(x=None, y=None, batch_size=None, epochs=1, callbacks=None): runs training
    - x: features; it can be
      - 1. A Numpy array (or array-like), or a list of arrays
      - 2. A TensorFlow tensor, or a list of tensors
      - 3. A tf.data dataset or a dataset iterator; it should return a tuple of either (inputs, targets) or (inputs, targets, sample_weights).
      - 4. A generator or keras.utils.Sequence returning (inputs, targets) or (inputs, targets, sample_weights).
    - y: targets
    - batch_size=None: batch size
    - epochs=1: number of training epochs
    - callbacks=None: list of callbacks (used, for example, for TensorBoard logging)
```python
model.compile(optimizer=tf.keras.optimizers.Adam(),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.fit(train_images, train_labels, epochs=5)

model.evaluate(test_images, test_labels)
```
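As noted above, the loss can also be passed as the function object rather than its string name; a minimal sketch of the equivalent compile call for the same model:

```python
# Equivalent compile call, using the loss function object instead of the string name.
model.compile(optimizer=tf.keras.optimizers.Adam(),
              loss=tf.keras.losses.sparse_categorical_crossentropy,
              metrics=['accuracy'])
```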
1.3.2 Case Study: A Multi-Layer Neural Network for Fashion Classification
Fashion-MNIST contains 70,000 grayscale images covering 10 categories; each image shows a single article of clothing at a low resolution of 28x28 pixels.
1.3.2.1 Requirements
Classify each image into one of the following 10 categories:

Label | Category
---|---
0 | T-shirt/top
1 | Trouser
2 | Pullover
3 | Dress
4 | Coat
5 | Sandal
6 | Shirt
7 | Sneaker
8 | Bag
9 | Ankle boot
1.3.2.2 Step Analysis and Code Implementation
- Read the dataset
  - Get the corresponding dataset from keras.datasets, which directly provides both a training set and a test set

```python
class SingleNN(object):

    def __init__(self):
        (self.train, self.train_label), (self.test, self.test_label) = keras.datasets.fashion_mnist.load_data()
```
- Write the model
  - Two layers: a fully connected layer with 128 neurons, followed by a fully connected output layer for the 10 classes

```python
class SingleNN(object):

    model = keras.Sequential([
        keras.layers.Flatten(input_shape=(28, 28)),
        keras.layers.Dense(128, activation=tf.nn.relu),
        keras.layers.Dense(10, activation=tf.nn.softmax)
    ])
```

Here model is simply placed inside the class as a fixed class-level model attribute.
Choosing an activation function
When it comes to optimizing the network, there is a choice of activation functions: which one should the hidden layers and output units use? So far we have always used the sigmoid function, but other functions can work much better. Most of these choices come from practice and do not have strong theoretical explanations.
The activation functions to choose from include (see the sketch after this list):
- tanh (the hyperbolic tangent function):
It works better than the sigmoid function because its output lies between -1 and 1.
Note: tanh has the same drawback as sigmoid: when z tends to plus or minus infinity, the derivative (the slope of the function) approaches 0, which slows down gradient-based algorithms.
- ReLU (the rectified linear unit):
When z > 0 the gradient is always 1, which speeds up gradient-based training; convergence is much faster than with sigmoid or tanh. When z < 0 the gradient is 0, but in practice this drawback has little impact.
- Leaky ReLU:
Leaky ReLU keeps the gradient non-zero when z < 0. In theory it has all the advantages of ReLU, but in practice it has not been shown to be consistently better than ReLU, so it is used less often.
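The sketch below simply evaluates these activations on a few sample inputs using the tf.nn functions mentioned above (assuming TensorFlow 2.x with eager execution, so tensors can be printed directly):

```python
import tensorflow as tf

z = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])

# Compare the activations discussed above on the same inputs.
print(tf.nn.sigmoid(z).numpy())      # squashed into (0, 1)
print(tf.nn.tanh(z).numpy())         # squashed into (-1, 1)
print(tf.nn.relu(z).numpy())         # 0 for z < 0, identity for z > 0
print(tf.nn.leaky_relu(z).numpy())   # small negative slope (alpha=0.2 by default) for z < 0
```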
Why a non-linear activation function is needed
Using a linear activation function is no different from using no activation function at all, i.e. from plain Logistic regression: no matter how many layers the network has, the output is just a linear combination of the input, which is equivalent to having no hidden layers, and the network degenerates into the most primitive perceptron.
<span class="katex"><span class="katex-mathml"><math><semantics><mrow><msup><mi>a</mi><mrow><mo>[</mo><mn>1</mn><mo>]</mo></mrow></msup><mo>=</mo><msup><mi>z</mi><mrow><mo>[</mo><mn>1</mn><mo>]</mo></mrow></msup><mo>=</mo><msup><mi>W</mi><mrow><mo>[</mo><mn>1</mn><mo>]</mo></mrow></msup><mi>x</mi><mo>+</mo><msup><mi>b</mi><mrow><mo>[</mo><mn>1</mn><mo>]</mo></mrow></msup></mrow><annotation encoding="application/x-tex">a^{[1]} = z^{[1]} = W^{[1]}x+b^{[1]}</annotation></semantics></math></span><span aria-hidden="true" class="katex-html"><span class="strut" style="height:0.8879999999999999em;"></span><span class="strut bottom" style="height:0.9713299999999999em;vertical-align:-0.08333em;"></span><span class="base textstyle uncramped"><span class="mord"><span class="mord mathit">a</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">1</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mrel">=</span><span class="mord"><span class="mord mathit" style="margin-right:0.04398em;">z</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">1</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mrel">=</span><span class="mord"><span class="mord mathit" style="margin-right:0.13889em;">W</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">1</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mord mathit">x</span><span class="mbin">+</span><span class="mord"><span class="mord mathit">b</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">1</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span></span></span></span>
<span class="katex"><span class="katex-mathml"><math><semantics><mrow><msup><mrow><mi>a</mi></mrow><mrow><mo>[</mo><mn>2</mn><mo>]</mo></mrow></msup><mo>=</mo><msup><mi>z</mi><mrow><mo>[</mo><mn>2</mn><mo>]</mo></mrow></msup><mo>=</mo><msup><mi>W</mi><mrow><mo>[</mo><mn>2</mn><mo>]</mo></mrow></msup><msup><mi>a</mi><mrow><mo>[</mo><mn>1</mn><mo>]</mo></mrow></msup><mo>+</mo><msup><mi>b</mi><mrow><mo>[</mo><mn>2</mn><mo>]</mo></mrow></msup></mrow><annotation encoding="application/x-tex">{a}^{[2]}=z^{[2]} = W^{[2]}a^{[1]}+b^{[2]}</annotation></semantics></math></span><span aria-hidden="true" class="katex-html"><span class="strut" style="height:0.8879999999999999em;"></span><span class="strut bottom" style="height:0.9713299999999999em;vertical-align:-0.08333em;"></span><span class="base textstyle uncramped"><span class="mord"><span class="mord textstyle uncramped"><span class="mord mathit">a</span></span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">2</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mrel">=</span><span class="mord"><span class="mord mathit" style="margin-right:0.04398em;">z</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">2</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mrel">=</span><span class="mord"><span class="mord mathit" style="margin-right:0.13889em;">W</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">2</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mord"><span class="mord mathit">a</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">1</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mbin">+</span><span class="mord"><span class="mord mathit">b</span><span 
class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">2</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span></span></span></span>
In that case this is equivalent to
<span class="katex"><span class="katex-mathml"><math><semantics><mrow><msup><mrow><mi>a</mi></mrow><mrow><mo>[</mo><mn>2</mn><mo>]</mo></mrow></msup><mo>=</mo><msup><mi>z</mi><mrow><mo>[</mo><mn>2</mn><mo>]</mo></mrow></msup><mo>=</mo><msup><mi>W</mi><mrow><mo>[</mo><mn>2</mn><mo>]</mo></mrow></msup><mo>(</mo><msup><mi>W</mi><mrow><mo>[</mo><mn>1</mn><mo>]</mo></mrow></msup><mi>x</mi><mo>+</mo><msup><mi>b</mi><mrow><mo>[</mo><mn>1</mn><mo>]</mo></mrow></msup><mo>)</mo><mo>+</mo><msup><mi>b</mi><mrow><mo>[</mo><mn>2</mn><mo>]</mo></mrow></msup><mo>=</mo><mo>(</mo><msup><mi>W</mi><mrow><mo>[</mo><mn>2</mn><mo>]</mo></mrow></msup><msup><mi>W</mi><mrow><mo>[</mo><mn>1</mn><mo>]</mo></mrow></msup><mo>)</mo><mi>x</mi><mo>+</mo><mo>(</mo><msup><mi>W</mi><mrow><mo>[</mo><mn>2</mn><mo>]</mo></mrow></msup><msup><mi>b</mi><mrow><mo>[</mo><mn>1</mn><mo>]</mo></mrow></msup><mo>+</mo><msup><mi>b</mi><mrow><mo>[</mo><mn>2</mn><mo>]</mo></mrow></msup><mo>)</mo><mo>=</mo><mi>w</mi><mi>x</mi><mo>+</mo><mi>b</mi></mrow><annotation encoding="application/x-tex">{a}^{[2]}=z^{[2]} = W^{[2]}(W^{[1]}x+b^{[1]})+b^{[2]}=(W^{[2]}W^{[1]})x+(W^{[2]}b^{[1]}+b^{[2]})=wx+b</annotation></semantics></math></span><span aria-hidden="true" class="katex-html"><span class="strut" style="height:0.8879999999999999em;"></span><span class="strut bottom" style="height:1.138em;vertical-align:-0.25em;"></span><span class="base textstyle uncramped"><span class="mord"><span class="mord textstyle uncramped"><span class="mord mathit">a</span></span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">2</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mrel">=</span><span class="mord"><span class="mord mathit" style="margin-right:0.04398em;">z</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">2</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mrel">=</span><span class="mord"><span class="mord mathit" style="margin-right:0.13889em;">W</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">2</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mopen">(</span><span class="mord"><span class="mord mathit" 
style="margin-right:0.13889em;">W</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">1</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mord mathit">x</span><span class="mbin">+</span><span class="mord"><span class="mord mathit">b</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">1</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mclose">)</span><span class="mbin">+</span><span class="mord"><span class="mord mathit">b</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">2</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mrel">=</span><span class="mopen">(</span><span class="mord"><span class="mord mathit" style="margin-right:0.13889em;">W</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">2</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mord"><span class="mord mathit" style="margin-right:0.13889em;">W</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">1</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mclose">)</span><span class="mord mathit">x</span><span class="mbin">+</span><span class="mopen">(</span><span class="mord"><span class="mord mathit" style="margin-right:0.13889em;">W</span><span 
class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">2</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mord"><span class="mord mathit">b</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">1</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mbin">+</span><span class="mord"><span class="mord mathit">b</span><span class="msupsub"><span class="vlist"><span style="top:-0.363em;margin-right:0.05em;"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span><span class="reset-textstyle scriptstyle uncramped mtight"><span class="mord scriptstyle uncramped mtight"><span class="mopen mtight">[</span><span class="mord mathrm mtight">2</span><span class="mclose mtight">]</span></span></span></span><span class="baseline-fix"><span class="fontsize-ensurer reset-size5 size5"><span style="font-size:0em;"></span></span></span></span></span></span><span class="mclose">)</span><span class="mrel">=</span><span class="mord mathit" style="margin-right:0.02691em;">w</span><span class="mord mathit">x</span><span class="mbin">+</span><span class="mord mathit">b</span></span></span></span>
- Compile, train, and evaluate

```python
    def compile(self):
        # Configure the optimizer, loss, and evaluation metrics
        SingleNN.model.compile(optimizer=tf.train.AdamOptimizer(),
                               loss=tf.keras.losses.sparse_categorical_crossentropy,
                               metrics=['accuracy'])
        return None

    def fit(self):
        # Train on the Fashion-MNIST training set for 5 epochs
        SingleNN.model.fit(self.train, self.train_label, epochs=5)
        return None

    def evaluate(self):
        # Report loss and accuracy on the test set
        test_loss, test_acc = SingleNN.model.evaluate(self.test, self.test_label)
        print(test_loss, test_acc)
        return None
```
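Putting the fragments together, a minimal driver might look like this (a sketch assuming the __init__, model, compile, fit, and evaluate pieces above have all been placed inside SingleNN):

```python
if __name__ == "__main__":
    snn = SingleNN()     # loads the fashion_mnist train/test data
    snn.compile()        # configure optimizer, loss, metrics
    snn.fit()            # train for 5 epochs
    snn.evaluate()       # print test loss and accuracy
```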
1.3.2.3 Printing the model summary
- model.summary(): shows the model structure
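For example (assuming the SingleNN.model defined above), summary() lists each layer's output shape and parameter count; for this model the counts can be checked by hand:

```python
# Flatten -> (None, 784); Dense(128): 784*128 + 128 = 100480 params;
# Dense(10): 128*10 + 10 = 1290 params; 101770 trainable parameters in total.
SingleNN.model.summary()
```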
1.3.2.4 Manually saving and restoring the model
- Purpose: training can take a long time, and saving the weights avoids retraining from scratch if something unexpected interrupts it
- model.save_weights('./weights/my_model')
- model.load_weights('./weights/my_model')

```python
# Save the trained weights to a checkpoint directory
SingleNN.model.save_weights("./ckpt/SingleNN")
```

```python
# Requires `import os` and `import numpy as np` at the top of the file.

    def predict(self):
        # Reuse previously trained weights directly if a checkpoint exists
        if os.path.exists("./ckpt/checkpoint"):
            SingleNN.model.load_weights("./ckpt/SingleNN")

        predictions = SingleNN.model.predict(self.test)

        # Print the predicted class index for each test image
        print(np.argmax(predictions, 1))
        return
```
1.3.2.5 Adding TensorBoard to monitor the loss and other metrics

```python
# Add a TensorBoard callback to log training for visualization
tensorboard = keras.callbacks.TensorBoard(log_dir='./graph', histogram_freq=0,
                                          write_graph=True, write_images=True)

SingleNN.model.fit(self.train, self.train_label, epochs=5, callbacks=[tensorboard])
```
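After training, the logged run can be viewed by starting TensorBoard and pointing it at the log directory, for example with `tensorboard --logdir=./graph`, and then opening the local URL it prints in a browser.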