(13) Running the Neuron_Network_Entry.py Code
Next, starting from scratch, we will implement the Forward Propagation functionality in our framework by adding a new file, ForwardPropagation.py. The resulting directory structure is shown in the figure below:

[Figure: project directory structure after adding ForwardPropagation.py]
First, run the Neuron_Network_Entry.py code from the previous section. With hidden_layers = [8,4,2], the output is as follows:
+1 V1 V2
Hidden layer creation: 1 N[1][1] N[1][2] N[1][3] N[1][4]
N[1][5] N[1][6] N[1][7] N[1][8]
Hidden layer creation: 2 N[2][1] N[2][2] N[2][3] N[2][4]
Hidden layer creation: 3 N[3][1] N[3][2]
Output layer: Output
The weight from 1 at layers[0] to 4 at layers[1] : 0.6068210129454681
The weight from 1 at layers[0] to 5 at layers[1] : -0.2484459388829856
The weight from 1 at layers[0] to 6 at layers[1] : -0.15330978080474755
The weight from 1 at layers[0] to 7 at layers[1] : -0.7043595728834275
The weight from 1 at layers[0] to 8 at layers[1] : 0.2326233856656561
The weight from 1 at layers[0] to 9 at layers[1] : 0.1370947546453336
The weight from 1 at layers[0] to 10 at layers[1] : -0.5678315495000843
The weight from 1 at layers[0] to 11 at layers[1] : 0.4228337375917961
The weight from 2 at layers[0] to 4 at layers[1] : -0.809645549568444
The weight from 2 at layers[0] to 5 at layers[1] : 0.6651350098119597
The weight from 2 at layers[0] to 6 at layers[1] : 0.8866862037359011
The weight from 2 at layers[0] to 7 at layers[1] : 0.8493482916578279
The weight from 2 at layers[0] to 8 at layers[1] : 0.7459138220738961
The weight from 2 at layers[0] to 9 at layers[1] : 0.7119804407110522
The weight from 2 at layers[0] to 10 at layers[1] : 0.8501229219041362
The weight from 2 at layers[0] to 11 at layers[1] : 0.9825287750262468
The weight from 4 at layers[1] to 13 at layers[2] : -0.647536407998637
The weight from 4 at layers[1] to 14 at layers[2] : 0.3939851330747872
The weight from 4 at layers[1] to 15 at layers[2] : -0.4000691912165364
The weight from 4 at layers[1] to 16 at layers[2] : -0.6950769412976536
The weight from 5 at layers[1] to 13 at layers[2] : 0.8792446406675074
The weight from 5 at layers[1] to 14 at layers[2] : 0.7592505893508912
The weight from 5 at layers[1] to 15 at layers[2] : 0.15787377267019576
The weight from 5 at layers[1] to 16 at layers[2] : 0.3780373104391821
The weight from 6 at layers[1] to 13 at layers[2] : -0.14032403924915415
The weight from 6 at layers[1] to 14 at layers[2] : 0.6330830632525586
The weight from 6 at layers[1] to 15 at layers[2] : 0.12793015924536744
The weight from 6 at layers[1] to 16 at layers[2] : -0.45780387142558754
The weight from 7 at layers[1] to 13 at layers[2] : 0.5480459440566925
The weight from 7 at layers[1] to 14 at layers[2] : 0.08382939237546116
The weight from 7 at layers[1] to 15 at layers[2] : -0.13346393259318656
The weight from 7 at layers[1] to 16 at layers[2] : 0.5700789207310142
The weight from 8 at layers[1] to 13 at layers[2] : -0.8102762376848285
The weight from 8 at layers[1] to 14 at layers[2] : 0.4660723020476589
The weight from 8 at layers[1] to 15 at layers[2] : -0.6105943524465972
The weight from 8 at layers[1] to 16 at layers[2] : 0.31966736558911024
The weight from 9 at layers[1] to 13 at layers[2] : 0.021772048040873848
The weight from 9 at layers[1] to 14 at layers[2] : -0.9003503697758317
The weight from 9 at layers[1] to 15 at layers[2] : -0.2974160429419417
The weight from 9 at layers[1] to 16 at layers[2] : 0.5973174686285172
The weight from 10 at layers[1] to 13 at layers[2] : 0.2061131681003876
The weight from 10 at layers[1] to 14 at layers[2] : 0.07692785045228345
The weight from 10 at layers[1] to 15 at layers[2] : 0.8815327149069194
The weight from 10 at layers[1] to 16 at layers[2] : -0.8495544416425534
The weight from 11 at layers[1] to 13 at layers[2] : 0.2504075822452698
The weight from 11 at layers[1] to 14 at layers[2] : -0.94871642839309
The weight from 11 at layers[1] to 15 at layers[2] : -0.07434064139026775
The weight from 11 at layers[1] to 16 at layers[2] : -0.7177310583583724
The weight from 13 at layers[2] to 18 at layers[3] : 0.12958354838226382
The weight from 13 at layers[2] to 19 at layers[3] : 0.48102943398870646
The weight from 14 at layers[2] to 18 at layers[3] : 0.43456703988302636
The weight from 14 at layers[2] to 19 at layers[3] : -0.4813490783095543
The weight from 15 at layers[2] to 18 at layers[3] : -0.033567109898365644
The weight from 15 at layers[2] to 19 at layers[3] : -0.4757310542095684
The weight from 16 at layers[2] to 18 at layers[3] : -0.7923820727831441
The weight from 16 at layers[2] to 19 at layers[3] : -0.5497090254107173
The weight from 18 at layers[3] to 20 at layers[4] : 1.009289190554072
The weight from 19 at layers[3] to 20 at layers[4] : -0.3082092533803199
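Before moving on to the explanation, it is worth decoding the node indices in the weight listing. Reading the printed numbers (this is an inference from the output above, not a scheme documented in the text): nodes are numbered globally across the whole network, every layer except the output layer starts with a bias node (indices 0, 3, 12, and 17), and weights are printed only for non-bias source nodes. A minimal sketch that reproduces the same numbering:

```python
# Hypothetical sketch: reproduce the global neuron numbering seen in the
# output above, assuming every layer except the output layer owns a bias node.
layer_sizes = [2, 8, 4, 2, 1]   # input, three hidden layers, output

index = 0
for layer, size in enumerate(layer_sizes):
    has_bias = layer < len(layer_sizes) - 1   # the output layer has no bias
    if has_bias:
        print(f"layers[{layer}] bias node -> index {index}")
        index += 1
    for _ in range(size):
        print(f"layers[{layer}] neuron    -> index {index}")
        index += 1
# Produces bias nodes at 0, 3, 12, 17 and neurons at 1-2, 4-11, 13-16,
# 18-19, and 20, matching the from/to indices in the weight listing.
```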
The output above shows that three hidden layers were created. In N[i][j], the first index i identifies the hidden layer and the second index j identifies the neuron within it: the first hidden layer has 8 neurons, the second hidden layer has 4 neurons, and the third hidden layer has 2 neurons. This matches the TensorFlow visualization exactly, where the first hidden layer likewise has 8 neurons, the second 4, and the third 2. TensorFlow's input layer has 2 neurons, and so does ours: the third column of instances is derived from the first and second columns. In an Exclusive OR computation the result is 1 only when the first element and the second element differ; when they are the same, the result is 0. We feed this data into the system, but the system does not know in advance that the task is Exclusive OR. Training the neural network means letting the system clearly learn the relationship between column 1 and column 2 and predict whether column 3 is 0 or 1. The reader may object that there are only 4 rows of data here, while a real environment may hold terabytes of data; but the essence of the system is the same, and millions or even billions of records are understood in exactly the same way.
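For concreteness, the XOR training data described above can be written as follows (a minimal sketch; the variable name instances follows the text, but the exact layout inside Neuron_Network_Entry.py may differ):

```python
# Hypothetical sketch of the XOR training data described above.
# Columns 1 and 2 are the two input features; column 3 is the XOR label:
# the label is 1 only when the two inputs differ.
instances = [
    [0, 0, 0],
    [0, 1, 1],
    [1, 0, 1],
    [1, 1, 0],
]

hidden_layers = [8, 4, 2]   # the configuration used for the run above

for v1, v2, label in instances:
    assert label == (v1 ^ v2)   # column 3 is indeed v1 XOR v2
```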
Deriving column 3 from some relationship between columns 1 and 2 begins with training. Training produces errors, and the errors are then adjusted; adjusting the error is the Back Propagation process. We check whether the predicted value matches the actual value and how large the error is. Back Propagation works backward from the output, identifying which factors caused the error and to what degree, so that the next pass performs better. Back Propagation will be explained in a later chapter. This section covers the Forward Propagation process: starting from the 2 features of the input layer, through the 8 neurons of the first hidden layer, then the 4 neurons of the second hidden layer, then the 2 neurons of the third hidden layer, and finally producing the single neuron of the output layer. This process is exactly the same as the one shown in the TensorFlow visualization.
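To make the forward flow concrete, here is a minimal Forward Propagation sketch through the 2-8-4-2-1 topology just described (an illustrative implementation, not the actual ForwardPropagation.py we will write; it assumes sigmoid activations and random initial weights like those in the listing above):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward_propagation(features, weights, biases):
    """Push one instance through the network, layer by layer."""
    activations = features
    for w_layer, b_layer in zip(weights, biases):
        # Each neuron's output: sigmoid of the weighted sum of the
        # previous layer's activations, plus the neuron's bias.
        activations = [
            sigmoid(sum(w * a for w, a in zip(neuron_w, activations)) + b)
            for neuron_w, b in zip(w_layer, b_layer)
        ]
    return activations

layer_sizes = [2, 8, 4, 2, 1]   # input, hidden layers [8, 4, 2], output
weights = [[[random.uniform(-1, 1) for _ in range(n_in)]
            for _ in range(n_out)]
           for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]
biases = [[random.uniform(-1, 1) for _ in range(n_out)]
          for n_out in layer_sizes[1:]]

print(forward_propagation([0, 1], weights, biases))   # e.g. [0.62...]
```

Each layer's activations are simply the sigmoid of a weighted sum of the previous layer's activations plus a bias; repeating that from the input to the output layer is all that Forward Propagation does.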
All the data are the same: data of different kinds are fundamentally no different. If a system can process the digits 0 through 9, it can process image, audio, and video data, because there is no difference between them. Sound and images are likewise converted into digits such as 0 through 9 for processing; images are transformed through a CNN, whose core purpose when handling a picture is to turn it into this most basic numeric form. Data is data.
Next we write the code. Our goal is to start from the Input Layer, pass through the processing of all the Hidden Layers, and finally produce the result of the Output Layer. We need to take the first step. It is only a small step, but it is a giant leap in our journey through artificial intelligence, because with it we will complete the Forward Propagation functionality.
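As a preview of the shape such a module might take (purely an assumption about structure; the actual ForwardPropagation.py developed next may organize things differently), the same computation can also be phrased over explicit node and weight objects, which is closer to the from/to style of the weight listing above:

```python
# Hypothetical object-style sketch, assuming the framework keeps explicit
# Node and Weight objects like those implied by the weight listing above.
import math

class Node:
    def __init__(self, index, is_bias=False):
        self.index = index
        self.is_bias = is_bias
        self.value = 1.0 if is_bias else 0.0   # bias nodes always output 1

class Weight:
    def __init__(self, from_node, to_node, value):
        self.from_node, self.to_node, self.value = from_node, to_node, value

def apply_forward_propagation(layers, weights):
    """Set each non-input, non-bias node to the sigmoid of its weighted inputs."""
    for layer in layers[1:]:
        for node in layer:
            if node.is_bias:
                continue
            net = sum(w.value * w.from_node.value
                      for w in weights if w.to_node is node)
            node.value = 1.0 / (1.0 + math.exp(-net))
```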
