1. adb

adb shell — open a shell on the board (you start in the board's root directory)
exit — leave the board shell and return to the Linux host
adb pull <board path> <Linux path> — copy a file from the board to the Linux host
adb push <Linux path> <board path> — copy a file from the Linux host to the board
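
As a concrete illustration, the same push/pull flow can also be driven from the Linux host with a small Python sketch; the model, script, and output paths below are hypothetical placeholders, not paths taken from this board image.

import subprocess

# copy a model and a test script from the Linux host to the board (hypothetical paths)
subprocess.run(['adb', 'push', 'resnet18_for_rk3588.rknn', '/userdata/'], check=True)
subprocess.run(['adb', 'push', 'test.py', '/userdata/'], check=True)

# copy a result file back from the board to the Linux host (hypothetical path)
subprocess.run(['adb', 'pull', '/userdata/output.txt', './'], check=True)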

2. Testing the AI functionality

2.1 resnet18

import cv2
import numpy as np
import platform
from synset_label import labels
from rknnlite.api import RKNNLite

# device tree compatible node for RK356x/RK3576/RK3588
DEVICE_COMPATIBLE_NODE = '/proc/device-tree/compatible'

def get_host():
    # get platform and device type
    system = platform.system()
    machine = platform.machine()
    os_machine = system + '-' + machine
    if os_machine == 'Linux-aarch64':
        try:
            with open(DEVICE_COMPATIBLE_NODE) as f:
                device_compatible_str = f.read()
                if 'rk3562' in device_compatible_str:
                    host = 'RK3562'
                elif 'rk3576' in device_compatible_str:
                    host = 'RK3576'
                elif 'rk3588' in device_compatible_str:
                    host = 'RK3588'
                else:
                    host = 'RK3566_RK3568'
        except IOError:
            print('Read device node {} failed.'.format(DEVICE_COMPATIBLE_NODE))
            exit(-1)
    else:
        host = os_machine
    return host

INPUT_SIZE = 224

RK3566_RK3568_RKNN_MODEL = 'resnet18_for_rk3566_rk3568.rknn'
RK3588_RKNN_MODEL = 'resnet18_for_rk3588.rknn'
RK3562_RKNN_MODEL = 'resnet18_for_rk3562.rknn'
RK3576_RKNN_MODEL = 'resnet18_for_rk3576.rknn'


def show_top5(result):
    output = result[0].reshape(-1)
    # Softmax
    output = np.exp(output) / np.sum(np.exp(output))
    # Get the indices of the top 5 largest values
    output_sorted_indices = np.argsort(output)[::-1][:5]
    top5_str = 'resnet18\n-----TOP 5-----\n'
    for i, index in enumerate(output_sorted_indices):
        value = output[index]
        if value > 0:
            topi = '[{:>3d}] score:{:.6f} class:"{}"\n'.format(index, value, labels[index])
        else:
            topi = '-1: 0.0\n'
        top5_str += topi
    print(top5_str)


if __name__ == '__main__':

    # Get device information
    host_name = get_host()
    if host_name == 'RK3566_RK3568':
        rknn_model = RK3566_RK3568_RKNN_MODEL
    elif host_name == 'RK3562':
        rknn_model = RK3562_RKNN_MODEL
    elif host_name == 'RK3576':
        rknn_model = RK3576_RKNN_MODEL
    elif host_name == 'RK3588':
        rknn_model = RK3588_RKNN_MODEL
    else:
        print("This demo cannot run on the current platform: {}".format(host_name))
        exit(-1)

    rknn_lite = RKNNLite()

    # Load RKNN model
    print('--> Load RKNN model')
    ret = rknn_lite.load_rknn(rknn_model)
    if ret != 0:
        print('Load RKNN model failed')
        exit(ret)
    print('done')

    ori_img = cv2.imread('./space_shuttle_224.jpg')
    img = cv2.cvtColor(ori_img, cv2.COLOR_BGR2RGB)
    img = np.expand_dims(img, 0)

    # Init runtime environment
    print('--> Init runtime environment')
    # When running on RK356x / RK3576 / RK3588 with a Debian OS, there is no need to specify the target.
    if host_name in ['RK3576', 'RK3588']:
        # For RK3576 / RK3588, specify which NPU core the model runs on through the core_mask parameter.
        ret = rknn_lite.init_runtime(core_mask=RKNNLite.NPU_CORE_0)
    else:
        ret = rknn_lite.init_runtime()
    if ret != 0:
        print('Init runtime environment failed')
        exit(ret)
    print('done')

    # Inference
    print('--> Running model')
    outputs = rknn_lite.inference(inputs=[img])

    # Show the classification results
    show_top5(outputs)
    print('done')

    rknn_lite.release()


2.2 Installation

1. Install Anaconda
2. Install adb
3. Install the cross-compilation toolchain
4. Install RKNN Toolkit Lite2 (a quick import check is sketched after this list)
5. Update the NPU driver (pending)
6. Check and adjust the CPU and NPU frequencies (pending)
7. Install rknn-toolkit2 (pending)
8. Install conda (pending)
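
After step 4, a minimal sanity check on the board (assuming the Debian image provides Python 3) is simply to confirm that the package used by the resnet18 script above can be imported:

# verify that RKNN Toolkit Lite2 is importable on the board
from rknnlite.api import RKNNLite
print('RKNN Toolkit Lite2 import OK:', RKNNLite)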

RKNN Toolkit Lite2 vs. rknn-toolkit2
rknn-toolkit2 addresses this problem. It provides a complete solution that lets developers convert already-trained deep learning models (for example in TensorFlow, PyTorch, or ONNX format) into the optimized RKNN format. Models in this format can take full advantage of the hardware features of devices such as the RK3568 and RK3588, preserving model accuracy while reducing the demand on compute resources. (In short: optimize and convert to RKNN.)

RKNN Toolkit Lite2 is a lightweight tool that provides a Python programming interface specifically for platforms with a Rockchip NPU, helping users deploy the RKNN models exported by RKNN-Toolkit2. It focuses on on-board deployment and inference (as in the resnet18 script in section 2.1) and is normally used on the actual hardware. Its documentation is located in the external/rknn/rknn-toolkit2/rknn-toolkit-lite2/doc directory, and the Toybrick Debian 11 system ships with the RKNN Toolkit Lite2 whl package pre-installed.

rknn-toolkit2 is a more comprehensive development kit. It not only covers the functionality of Lite2 but also provides additional tools and interfaces for model conversion, inference, and performance evaluation on the PC side. It supports a wider feature set, including model conversion, quantization, and performance and memory evaluation. Its documentation is located in the external/rknn/rknn-toolkit2/doc directory.
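
To make the PC-side workflow concrete, below is a minimal sketch of converting a trained model into the RKNN format with rknn-toolkit2. It assumes an ONNX export of resnet18 (resnet18.onnx) and a calibration image list (dataset.txt); these file names and the ImageNet mean/std values are illustrative, not taken from the text above.

from rknn.api import RKNN

rknn = RKNN(verbose=True)

# preprocessing and target platform (illustrative ImageNet mean/std values)
rknn.config(mean_values=[[123.675, 116.28, 103.53]],
            std_values=[[58.395, 58.395, 58.395]],
            target_platform='rk3588')

# load the trained model (hypothetical ONNX export of resnet18)
ret = rknn.load_onnx(model='resnet18.onnx')
if ret != 0:
    print('Load model failed')
    exit(ret)

# quantize and build, using dataset.txt as the calibration image list
ret = rknn.build(do_quantization=True, dataset='./dataset.txt')
if ret != 0:
    print('Build model failed')
    exit(ret)

# export the optimized RKNN model consumed by the board-side script in 2.1
ret = rknn.export_rknn('./resnet18_for_rk3588.rknn')
if ret != 0:
    print('Export rknn failed')
    exit(ret)

rknn.release()

The performance and memory evaluation mentioned above is exposed through eval_perf() and eval_memory(), which generally require the board to be reachable over adb (init_runtime(target='rk3588')) rather than the simulator.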