[Abstract] TensorFlow Serving is an important tool for serving persisted (saved) TensorFlow models. This article describes how to set up and debug TensorFlow Serving with Docker Compose.

TensorFlow Serving GitHub repository:

https://github.com/tensorflow/serving


Create the docker-compose working directory.


Create a docker-compose.yml file under the serving directory, as sketched below.

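A minimal sketch of the layout, assuming the working directory is called serving and that models/ and config/ subdirectories will hold the exported models and the serving configuration (these names are placeholders, not taken from the original screenshots):

# create the working directory and the folders referenced later
mkdir -p ~/serving/models ~/serving/config
cd ~/serving
touch docker-compose.yml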

1. Download and install TensorFlow Serving and verify that it runs


Pull the latest version of the TensorFlow Serving Docker image and run a quick test.

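The commands below follow the quick-start in the TensorFlow Serving README: pull the image, serve the bundled half_plus_two test model, and send a test request (the clone path and the trailing & are illustrative):

docker pull tensorflow/serving

# serve the demo model shipped in the serving repository
git clone https://github.com/tensorflow/serving
TESTDATA="$(pwd)/serving/tensorflow_serving/servables/tensorflow/testdata"

docker run -t --rm -p 8501:8501 \
    -v "$TESTDATA/saved_model_half_plus_two_cpu:/models/half_plus_two" \
    -e MODEL_NAME=half_plus_two \
    tensorflow/serving &

# should return predictions of roughly [2.5, 3.0, 4.5]
curl -d '{"instances": [1.0, 2.0, 5.0]}' \
    -X POST http://localhost:8501/v1/models/half_plus_two:predict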

2. Train a model with TensorFlow and export the model files

(https://www.tensorflow.org/guide/saved_model#prepare_serving_inputs)

First, export the trained model as a SavedModel (*.pb) directory.

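A minimal sketch of the export step. The Keras model below is only a placeholder standing in for your own trained model (the article's later request template suggests an Iris classifier exported from an Estimator); what matters here is the tf.saved_model.save call and the versioned export path:

import tensorflow as tf

# Placeholder model; substitute your own trained model here.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])

# TensorFlow Serving expects a numeric version subdirectory under the model name,
# e.g. <export path> = /path/to/models/iris/1
export_path = "/path/to/models/iris/1"

# Writes saved_model.pb plus the variables/ directory under export_path.
# Note: a plain Keras SavedModel exposes a "serving_default" signature; the
# "predict" signature used later in this article is typical of Estimator exports.
tf.saved_model.save(model, export_path)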

After exporting the model files, record the directory they were written to, referred to below as <export path>.

Inspect the exported model:

saved_model_cli show --dir <export path> --all


3. Using TensorFlow Serving

(https://github.com/tensorflow/serving/blob/master/tensorflow_serving/g3doc/serving_config.md)

This section walks through a single-model test, demonstrates the POST request, and explains the JSON request/response design.

docker-compose.yml file example:

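The original example is given as a screenshot; the following is a minimal sketch of such a compose file. The service name, image tag, ports, and the environment variables (TF_SERVING_VERSION, MODEL_PATH, CONFIG_PATH, read from .env) are assumptions to adapt to your own layout:

version: "3"

services:
  tf-serving:
    image: tensorflow/serving:${TF_SERVING_VERSION}
    container_name: tf_serving
    ports:
      - "8500:8500"    # gRPC
      - "8501:8501"    # REST API
    volumes:
      - ${MODEL_PATH}:/models      # host directory containing the SavedModel folders
      - ${CONFIG_PATH}:/config     # host directory containing models.config
    command:
      - --model_config_file=/config/models.config
    restart: always

The extra command argument is appended to the image's entrypoint (tensorflow_model_server), so the server should load models from models.config instead of the single default model.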

.env file configuration:

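A matching sketch of the .env file, using the variable names assumed in the compose sketch above:

# Variables read by docker-compose.yml (names are illustrative)
TF_SERVING_VERSION=latest
MODEL_PATH=./models
CONFIG_PATH=./config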

Single-model and multi-model deployment:

models.config file example:

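The original file is shown as a screenshot; below is a sketch in the format documented in serving_config.md (linked above). The model names, base paths, and pinned versions are placeholders:

model_config_list {
  config {
    name: "iris"
    base_path: "/models/iris"
    model_platform: "tensorflow"
    # serve two specific versions instead of only the latest
    model_version_policy {
      specific {
        versions: 1
        versions: 2
      }
    }
  }
  config {
    name: "half_plus_two"
    base_path: "/models/half_plus_two"
    model_platform: "tensorflow"
  }
}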

https://www.tensorflow.org/tfx/tutorials/serving/rest_simple

Multi-model, multi-version TensorFlow Serving is now successfully deployed.

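One way to check that a model loaded correctly is the model status endpoint of the REST API (the model name iris and port 8501 follow the assumptions above):

curl http://localhost:8501/v1/models/iris

# a healthy model should report its versions with state AVAILABLE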

Request template:

{
  "signature_name": "predict",
  "instances": [
    {
      "SepalLength": 5.1,
      "PetalLength": 1.7,
      "PetalWidth": 0.5,
      "SepalWidth": 3.3
    }
  ]
}
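Such a request can be sent to the REST API with curl; the model name iris, the optional version path segment, and port 8501 are assumptions carried over from the sketches above:

# predict with the latest available version
curl -X POST http://localhost:8501/v1/models/iris:predict \
  -H "Content-Type: application/json" \
  -d '{"signature_name": "predict",
       "instances": [{"SepalLength": 5.1, "SepalWidth": 3.3,
                      "PetalLength": 1.7, "PetalWidth": 0.5}]}'

# or pin a specific version
curl -X POST http://localhost:8501/v1/models/iris/versions/1:predict \
  -H "Content-Type: application/json" \
  -d '{"signature_name": "predict",
       "instances": [{"SepalLength": 5.1, "SepalWidth": 3.3,
                      "PetalLength": 1.7, "PetalWidth": 0.5}]}'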

Response template:

{
  "predictions": [
    {
      "classes": [              (class names)
        "0"
      ],
      "logits": [
        -3.47067
      ],
      "logistic": [
        0.0301584
      ],
      "class_ids": [            (class IDs)
        0
      ],
      "probabilities": [        (class probabilities)
        0.969842,
        0.0301584
      ]
    }
  ]
}

Source: Huawei Cloud Community. Author: Edison.