Xavier and orthogonal (ortho) initialization are common weight-initialization schemes for neural networks. For fully connected layers they are easy to understand, but how they are actually applied to the kernels of a convolutional network (CNN) is less obvious.

## What do fan_in and fan_out mean for convolutional layers in xavier / kaiming initialization?

`fan_in` is kernel_height × kernel_width × in_channels — the number of input values feeding each output activation.

`fan_out` is kernel_height × kernel_width × out_channels — the number of output values each input value contributes to.
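As a concrete sketch (function names here are illustrative, not from any library), the fans and the resulting Xavier/Glorot uniform bound for a conv kernel in HWIO layout (kernel_h, kernel_w, in_channels, out_channels) can be computed like this:

```python
import numpy as np

def conv_fans(shape):
    """fan_in / fan_out for a conv kernel in HWIO layout
    (kernel_h, kernel_w, in_channels, out_channels)."""
    kh, kw, cin, cout = shape
    fan_in = kh * kw * cin    # inputs feeding one output activation
    fan_out = kh * kw * cout  # outputs each input contributes to
    return fan_in, fan_out

def xavier_uniform(shape, seed=0):
    """Xavier/Glorot uniform: U(-limit, limit) with
    limit = sqrt(6 / (fan_in + fan_out))."""
    fan_in, fan_out = conv_fans(shape)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    rng = np.random.default_rng(seed)
    return rng.uniform(-limit, limit, size=shape).astype(np.float32)

# For a 3x3 conv from 64 to 128 channels:
# fan_in = 3*3*64 = 576, fan_out = 3*3*128 = 1152
w = xavier_uniform((3, 3, 64, 128))
```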

Orthogonal initialization (cf. `tf.orthogonal_initializer`) handles conv kernels by flattening them to a 2-D matrix first. The Lasagne-style implementation used in OpenAI Baselines:

```python
import numpy as np

def ortho_init(scale=1.0):
    def _ortho_init(shape, dtype, partition_info=None):
        # lasagne ortho init for tf
        shape = tuple(shape)
        if len(shape) == 2:
            flat_shape = shape
        elif len(shape) == 4:  # assumes NHWC
            # flatten (kh, kw, cin, cout) -> (kh*kw*cin, cout)
            flat_shape = (np.prod(shape[:-1]), shape[-1])
        else:
            raise NotImplementedError
        a = np.random.normal(0.0, 1.0, flat_shape)
        u, _, v = np.linalg.svd(a, full_matrices=False)
        q = u if u.shape == flat_shape else v  # pick the one with the correct shape
        q = q.reshape(shape)
        return (scale * q[:shape[0], :shape[1]]).astype(np.float32)
    return _ortho_init
```

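The key line is the 4-D branch: a (kh, kw, cin, cout) kernel is flattened to a (kh·kw·cin, cout) matrix, the SVD of a Gaussian matrix supplies an orthonormal factor of that shape, and the result is reshaped back into kernel form. The columns of the flattened kernel are therefore orthonormal, which can be checked directly. A self-contained sketch (the initializer is repeated so the snippet runs on its own; only NumPy is assumed):

```python
import numpy as np

def ortho_init(scale=1.0):
    def _ortho_init(shape, dtype, partition_info=None):
        shape = tuple(shape)
        if len(shape) == 2:
            flat_shape = shape
        elif len(shape) == 4:  # conv kernel: flatten to (kh*kw*cin, cout)
            flat_shape = (np.prod(shape[:-1]), shape[-1])
        else:
            raise NotImplementedError
        a = np.random.normal(0.0, 1.0, flat_shape)
        u, _, v = np.linalg.svd(a, full_matrices=False)
        q = u if u.shape == flat_shape else v
        q = q.reshape(shape)
        return (scale * q[:shape[0], :shape[1]]).astype(np.float32)
    return _ortho_init

init = ortho_init(scale=1.0)
w = init((3, 3, 8, 16), np.float32)  # 3x3 conv, 8 -> 16 channels

# Flatten back to the matrix that was orthogonalized: 72 x 16.
flat = w.reshape(-1, 16).astype(np.float64)
gram = flat.T @ flat  # orthonormal columns => W^T W ~ I
print(np.allclose(gram, np.eye(16), atol=1e-4))  # → True
```

Because kh·kw·cin (72) exceeds cout (16) here, only the columns, not the rows, can be orthonormal; for a 2-D weight the same check applies to whichever dimension is smaller.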