Example

For example, one training run produced the output below. The rest of this note explains the meaning of each parameter in the training log.

 5: 10.222071, 10.294983 avg loss, 0.000000 rate, 395.829699 seconds, 320 images

Loaded: 0.000000 seconds

Region Avg IOU: 0.227881, Class: 1.000000, Obj: 0.381839, No Obj: 0.465026, Avg Recall: 0.000000,  count: 16

Region Avg IOU: 0.203516, Class: 1.000000, Obj: 0.494320, No Obj: 0.466775, Avg Recall: 0.000000,  count: 8

Region Avg IOU: 0.313356, Class: 1.000000, Obj: 0.474162, No Obj: 0.466196, Avg Recall: 0.307692,  count: 13

Region Avg IOU: 0.267079, Class: 1.000000, Obj: 0.500940, No Obj: 0.466384, Avg Recall: 0.250000,  count: 8

Region Avg IOU: 0.419788, Class: 1.000000, Obj: 0.423686, No Obj: 0.466252, Avg Recall: 0.428571,  count: 7

Region Avg IOU: 0.361871, Class: 1.000000, Obj: 0.427209, No Obj: 0.466280, Avg Recall: 0.200000,  count: 10

Region Avg IOU: 0.365572, Class: 1.000000, Obj: 0.430385, No Obj: 0.466651, Avg Recall: 0.125000,  count: 8

Region Avg IOU: 0.342102, Class: 1.000000, Obj: 0.417325, No Obj: 0.466240, Avg Recall: 0.222222,  count: 9

Reference source code

region_layer.c ==> the last line of the forward_region_layer() function:

printf("Region Avg IOU: %f, Class: %f,           Obj: %f,       No Obj: %f,                       Avg Recall: %f,  count: %d\n", 

         avg_iou/count,   avg_cat/class_count, avg_obj/count, avg_anyobj/(l.w*l.h*l.n*l.batch), recall/count,    count);

Explanation

5 -- batch iteration: how many batches have been trained so far, i.e. how many batch -> batch -> batch ... rounds have been completed (here 5 iterations with a batch size of 64 gives 5 x 64 = 320 images, matching the last field of the line)

10.222071 -- total loss for the current batch

10.294983 -- average loss: a running average of the loss and the main value to watch when judging training progress; the lower, the better. The example given on the official site is:


  • 9002 - iteration number (number of batches trained)
  • 0.060730 avg - average loss (error) - the lower, the better (once the average loss roughly reaches this level, or stops decreasing, you can stop training)
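
As background on how the "avg loss" column relates to the per-batch loss: it is a running (exponential moving) average, so it lags behind and smooths the raw loss. Below is a minimal C sketch of that kind of update; the 0.9/0.1 smoothing factors and the loss values are illustrative assumptions, not taken from this log.

    #include <stdio.h>

    /* Running (exponential moving) average of the per-batch loss, the kind of
     * value reported in the "avg loss" column. The 0.9/0.1 smoothing factors
     * and the loss values are assumptions for illustration. */
    int main(void)
    {
        float batch_loss[] = {10.5f, 10.3f, 9.9f, 10.1f, 10.222071f};
        int n = sizeof(batch_loss) / sizeof(batch_loss[0]);
        float avg_loss = -1;

        for (int i = 0; i < n; ++i) {
            float loss = batch_loss[i];
            if (avg_loss < 0) avg_loss = loss;         /* first batch seeds the average */
            avg_loss = 0.9f * avg_loss + 0.1f * loss;  /* value printed as "avg loss" */
            printf("%d: %f, %f avg loss\n", i + 1, loss, avg_loss);
        }
        return 0;
    }

Because of this smoothing, the avg loss (10.294983 above) can sit slightly above or below the current batch loss (10.222071): it reflects recent history rather than a single batch.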

Region Avg IOU -- the average IOU: the ratio of the intersection to the union of the predicted bounding box and the ground-truth box; ideally this value approaches 1.
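
To make the IOU definition concrete, here is a small self-contained C sketch that computes it for two axis-aligned boxes given in (center x, center y, width, height) form, the box convention YOLO uses; the struct and function names are illustrative, not darknet's own.

    #include <stdio.h>

    /* Axis-aligned box in (center x, center y, width, height) form. */
    typedef struct { float x, y, w, h; } box_t;

    /* Length of the 1-D overlap of two intervals given by center and width;
     * negative when the intervals do not overlap. */
    static float overlap_1d(float c1, float w1, float c2, float w2)
    {
        float l1 = c1 - w1 / 2, r1 = c1 + w1 / 2;
        float l2 = c2 - w2 / 2, r2 = c2 + w2 / 2;
        float left  = l1 > l2 ? l1 : l2;
        float right = r1 < r2 ? r1 : r2;
        return right - left;
    }

    /* Intersection over union of two boxes; 0 when they are disjoint. */
    static float box_iou(box_t a, box_t b)
    {
        float ow = overlap_1d(a.x, a.w, b.x, b.w);
        float oh = overlap_1d(a.y, a.h, b.y, b.h);
        if (ow <= 0 || oh <= 0) return 0;
        float inter = ow * oh;
        float uni = a.w * a.h + b.w * b.h - inter;
        return inter / uni;
    }

    int main(void)
    {
        box_t pred  = {0.50f, 0.50f, 0.40f, 0.40f};  /* predicted box */
        box_t truth = {0.55f, 0.55f, 0.40f, 0.40f};  /* ground-truth box */
        printf("IOU = %f\n", box_iou(pred, truth));  /* partial overlap -> between 0 and 1 */
        return 0;
    }

Identical boxes give 1.0 and disjoint boxes give 0.0; Region Avg IOU is this quantity averaged over all matched boxes in the batch.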

Class -- the average predicted probability of the labeled object's class (avg_cat / class_count in the source above); ideally this value approaches 1 (this example has only one class, so it is 1.000000).

Obj -- the average confidence (objectness) with which positive samples are judged to be positives, i.e. the average confidence of the predictions matched to ground-truth boxes; ideally this value approaches 1.

No Obj -- the total confidence divided by the total number of predicted boxes (avg_anyobj / (l.w*l.h*l.n*l.batch) in the source above); this value should keep decreasing during training, but should not reach zero.

Avg Recall -- the ratio of positive samples that were found to the total number of positive samples; ideally this value approaches 1.

count -- the number of labeled (positive) boxes processed in this portion of the batch, i.e. the denominator used for the averages above; a short sketch of how these per-batch statistics are accumulated follows.
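
Tying the last few fields together, the Region line is printed once per forward pass from simple per-batch accumulators. The following C sketch shows that bookkeeping under stated assumptions: the variable names mirror the printf above, the input values are made up, the 0.5 IOU threshold for counting a recall is an assumption about the implementation, and Class / No Obj are omitted for brevity.

    #include <stdio.h>

    /* Per-batch accumulators behind the "Region Avg IOU ... count" line.
     * iou[] and objectness[] stand in for the values the layer computes for
     * the predictor matched to each ground-truth box (made-up numbers); the
     * 0.5 recall threshold is an assumption. */
    int main(void)
    {
        float iou[]        = {0.60f, 0.30f, 0.55f, 0.20f};  /* best IOU per ground-truth box   */
        float objectness[] = {0.50f, 0.40f, 0.45f, 0.35f};  /* confidence of the matched box   */
        int count = (int)(sizeof(iou) / sizeof(iou[0]));     /* ground-truth boxes in the batch */

        float avg_iou = 0, avg_obj = 0, recall = 0;
        for (int i = 0; i < count; ++i) {
            avg_iou += iou[i];
            avg_obj += objectness[i];
            if (iou[i] > 0.5f) recall += 1;                   /* counted as "found" above 0.5 IOU */
        }

        printf("Region Avg IOU: %f, Obj: %f, Avg Recall: %f,  count: %d\n",
               avg_iou / count, avg_obj / count, recall / count, count);
        return 0;
    }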