Running TensorFlow Serving in a Container

Model servers are now in their heyday. The days when, after training a model, you still had to hand-write a server with Flask or Falcon are almost over.

TensorFlow, as a popular AI framework, solved the model-server problem long ago, and the solution has evolved quickly: TensorFlow Serving is now a high-performance system fit for production environments.

TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. TensorFlow Serving makes it easy to deploy new algorithms and experiments, while keeping the same server architecture and APIs. TensorFlow Serving provides out of the box integration with TensorFlow models, but can be easily extended to serve other types of models.

Dockerized TensorFlow Serving

Running a TensorFlow model server with Docker is very easy:

docker pull tensorflow/serving:latest
docker pull tensorflow/serving:latest-gpu

Choose the CPU or GPU version as needed (the GPU image is more than ten times larger). If you need a specific version, check the available tags.

A few versions (not all of them) and their image sizes are listed below (as of November 2018):

TensorFlow Version   Tag                      Compressed Size   Comment
1.12.0               latest, 1.12.0           67 MB             Latest CPU version at the time of writing
1.12.0               latest-gpu, 1.12.0-gpu   1 GB              Latest GPU version at the time of writing
1.10.1               1.10.1                   62 MB
1.10.1               1.10.1-gpu               919 MB
1.8.0                1.8.0                    98 MB
1.8.0                1.8.0-devel-gpu          2 GB
1.6.1                1.6.1                    98 MB             Oldest CPU version with an official Docker image
1.6.1                1.6.1-devel-gpu          2 GB              Oldest GPU version with an official Docker image

Preparing the model

A trained model must be exported as a SavedModel using SavedModelBuilder.

The official example of exporting with SavedModelBuilder:

import os
import sys

import tensorflow as tf
from tensorflow.python.saved_model import signature_constants, tag_constants
from tensorflow.python.util import compat

# sess, FLAGS, prediction_signature, classification_signature and
# legacy_init_op are defined earlier in the official example.
export_path_base = sys.argv[-1]
# The export path is <base>/<version>, e.g. model0/1.
export_path = os.path.join(
    compat.as_bytes(export_path_base),
    compat.as_bytes(str(FLAGS.model_version)))
print('Exporting trained model to', export_path)
builder = tf.saved_model.builder.SavedModelBuilder(export_path)
builder.add_meta_graph_and_variables(
    sess, [tag_constants.SERVING],
    signature_def_map={
        'predict_images':
            prediction_signature,
        signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
            classification_signature,
    },
    legacy_init_op=legacy_init_op)
builder.save()

The exported SavedModel should have the following directory structure:

assets/
assets.extra/
variables/
    variables.data-?????-of-?????
    variables.index
saved_model.pb
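
Before serving, you can sanity-check the export by loading it back in a Python session. This is a minimal sketch, assuming a TensorFlow 1.x environment and that the export landed in model0/1 (the numeric version subdirectory, matching the layout used by the docker-compose example later in this article):

import tensorflow as tf

# Load the SavedModel back with the "serve" tag and list its signatures.
# "model0/1" is the export directory assumed in this article.
with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], "model0/1")
    print(list(meta_graph.signature_def.keys()))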

Running the service

First, make sure docker and docker-compose are installed. Installing docker is not covered here; docker-compose can be installed via pip.

sudo -H pip install docker-compose

Create a new directory. Inside it, place the model (here called model0; it should contain a numeric version subdirectory such as 1/ holding the SavedModel) and add a docker-compose.yaml file with the following content:

version: "3"

services:
  serving:
    image: tensorflow/serving:latest
    restart: unless-stopped
    ports:
      - 8500:8500
      - 8501:8501
    volumes:
      - ./model0:/models/MODEL0
    environment:
      - MODEL_NAME=MODEL0

Here, MODEL0 is a custom name and can be changed as needed, but the MODEL_NAME=* value and the /models/* path under volumes must stay consistent.

Once docker-compose.yaml is ready, a single command in the same directory starts the service:

docker-compose up -d

Example log of a successful launch

The -d flag of docker-compose up runs the service in the background. To view the logs, use docker-compose logs:

$ docker-compose logs -f
serving_1  | 2018-11-12 11:45:29.679213: I tensorflow_serving/model_servers/main.cc:157] Building single TensorFlow model file config:  model_name: MODEL0 model_base_path: /models/MODEL0
serving_1  | 2018-11-12 11:45:29.679673: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
serving_1  | 2018-11-12 11:45:29.679700: I tensorflow_serving/model_servers/server_core.cc:517]  (Re-)adding model: MODEL0
serving_1  | 2018-11-12 11:45:29.780031: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: MODEL0 version: 1}
serving_1  | 2018-11-12 11:45:29.780088: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: MODEL0 version: 1}
serving_1  | 2018-11-12 11:45:29.780101: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: MODEL0 version: 1}
serving_1  | 2018-11-12 11:45:29.780127: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:360] Attempting to load native SavedModelBundle in bundle-shim from: /models/MODEL0/1
serving_1  | 2018-11-12 11:45:29.780140: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /models/MODEL0/1
serving_1  | 2018-11-12 11:45:29.847226: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
serving_1  | 2018-11-12 11:45:29.862407: I external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
serving_1  | 2018-11-12 11:45:29.920827: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:113] Restoring SavedModel bundle.
serving_1  | 2018-11-12 11:45:29.920907: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:123] The specified SavedModel has no variables; no checkpoints were restored.
serving_1  | 2018-11-12 11:45:29.920917: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:148] Running LegacyInitOp on SavedModel bundle.
serving_1  | 2018-11-12 11:45:29.920932: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:233] SavedModel load for tags { serve }; Status: success. Took 140784 microseconds.
serving_1  | 2018-11-12 11:45:29.920969: I tensorflow_serving/servables/tensorflow/saved_model_warmup.cc:83] No warmup data file found at /models/MODEL0/1/assets.extra/tf_serving_warmup_requests
serving_1  | 2018-11-12 11:45:29.921085: I tensorflow_serving/core/loader_harness.cc:86] Successfully loaded servable version {name: MODEL0 version: 1}
serving_1  | 2018-11-12 11:45:29.922849: I tensorflow_serving/model_servers/main.cc:327] Running ModelServer at 0.0.0.0:8500 ...
serving_1  | [warn] getaddrinfo: address family for nodename not supported
serving_1  | 2018-11-12 11:45:29.924139: I tensorflow_serving/model_servers/main.cc:337] Exporting HTTP/REST API at:localhost:8501 ...
serving_1  | [evhttp_server.cc : 235] RAW: Entering the event loop ...

If the launch fails, the log will show messages such as the model not being found.
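
Besides reading the log, recent TensorFlow Serving versions also expose a model status endpoint on the REST port. A minimal sketch, assuming the requests library is installed and the MODEL0 name from the compose file above:

import requests

# Query TensorFlow Serving's model status endpoint on the REST port.
# A healthy reply looks roughly like:
#   {"model_version_status": [{"version": "1", "state": "AVAILABLE", ...}]}
resp = requests.get("http://localhost:8501/v1/models/MODEL0")
print(resp.status_code, resp.json())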

The configuration and commands above are for the CPU version. To use the GPU version, besides the NVIDIA driver and the GPU image, you also need nvidia-docker or (preferably) nvidia-docker2.

Usage

TensorFlow Serving 1.8.0 and later supports a RESTful API in addition to the traditional gRPC interface. gRPC is served on port 8500 and the RESTful API on port 8501.

The gRPC interface is defined in prediction_service.proto:

service PredictionService {
  // Classify.
  rpc Classify(ClassificationRequest) returns (ClassificationResponse);

  // Regress.
  rpc Regress(RegressionRequest) returns (RegressionResponse);

  // Predict -- provides access to loaded TensorFlow model.
  rpc Predict(PredictRequest) returns (PredictResponse);

  // MultiInference API for multi-headed models.
  rpc MultiInference(MultiInferenceRequest) returns (MultiInferenceResponse);

  // GetModelMetadata - provides access to metadata for loaded models.
  rpc GetModelMetadata(GetModelMetadataRequest)
      returns (GetModelMetadataResponse);
}
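
A minimal Python sketch of a gRPC Predict call, assuming the tensorflow-serving-api package is installed and reusing the MODEL0 name and the predict_images signature from the export example; the "images" input key and its [1, 784] shape are placeholders that must match your model's actual signature:

import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

# Connect to the gRPC port exposed by the container.
channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

# Build a PredictRequest; the model name and signature come from the export
# example above, while the "images" key and the shape are placeholders.
request = predict_pb2.PredictRequest()
request.model_spec.name = "MODEL0"
request.model_spec.signature_name = "predict_images"
fake_input = np.zeros((1, 784), dtype=np.float32)
request.inputs["images"].CopyFrom(
    tf.contrib.util.make_tensor_proto(fake_input, shape=[1, 784]))

response = stub.Predict(request, 10.0)  # 10-second timeout
print(response.outputs)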

For the RESTful API, the HTTP request looks roughly like this:

POST http://host:port/<URI>:<VERB>

URI: /v1/models/${MODEL_NAME}[/versions/${MODEL_VERSION}]
VERB: classify|regress|predict

The body is JSON. Example request formats for classify and regress are shown below (predict is omitted):

{
  // Optional: serving signature to use.
  // If unspecified default serving signature is used.
  "signature_name": <string>,

  // Optional: Common context shared by all examples.
  // Features that appear here MUST NOT appear in examples (below).
  "context": {
    "<feature_name3>": <value>|<list>
    "<feature_name4>": <value>|<list>
  },

  // List of Example objects
  "examples": [
    {
      // Example 1
      "<feature_name1>": <value>|<list>,
      "<feature_name2>": <value>|<list>,
      ...
    },
    {
      // Example 2
      "<feature_name1>": <value>|<list>,
      "<feature_name2>": <value>|<list>,
      ...
    }
    ...
  ]
}

After the request is sent successfully, the response looks roughly as follows.

classify

{
  "result": [
    // List of class label/score pairs for first Example (in request)
    [ [<label1>, <score1>], [<label2>, <score2>], ... ],

    // List of class label/score pairs for next Example (in request)
    [ [<label1>, <score1>], [<label2>, <score2>], ... ],
    ...
  ]
}

regress

{
  // One regression value for each example in the request in the same order.
  "result": [ <value1>, <value2>, <value3>, ...]
}
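
Putting the request and response formats together, here is a minimal Python sketch of a classify call, assuming the requests library, the MODEL0 model served above, and placeholder feature names:

import requests

# Classify request against MODEL0's default serving signature
# (the classification signature registered in the export example).
# The feature names and values below are placeholders for illustration.
payload = {
    "examples": [
        {"feature_name1": 1.0, "feature_name2": [1, 2, 3]},
    ],
}
resp = requests.post(
    "http://localhost:8501/v1/models/MODEL0:classify", json=payload)

# Per the response format above, expect something like:
# {"result": [[["label1", 0.9], ["label2", 0.1]]]}
print(resp.json())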

References

The TensorFlow links above point to the mainland-China mirror tensorflow.google.cn. For the latest content, visit tensorflow.org (access may require your own tools).

