Integrating TF Serving with TF Adapter

The following uses the installation user HwHiAiUser as an example to describe how to integrate TF Adapter into TF Serving. Replace it with the actual username. Ensure that the installation user has read (or read and write) permission on the paths described in this document.

  1. Download the TF Serving source code.

    The version of TF Serving must be the same as that of TensorFlow. Upload the source package to any directory on the server.

  2. Go to the directory where the source package is stored and run the following commands to decompress and access the TF Serving source package:
    unzip 1.15.0.zip
    cd serving-1.15.0/
    
  3. Add third-party dependency packages of TF Serving.
    1. Create the tf_adapter folder in the serving-1.15.0/third_party directory, and go to the folder.
      cd third_party/
      mkdir tf_adapter
      cd tf_adapter
      
    2. Copy the libpython3.7m.so.1.0 file to the tf_adapter folder, and create a soft link.
      cp /usr/local/python3.7.5/lib/libpython3.7m.so.1.0 .
      ln -s libpython3.7m.so.1.0 libpython3.7m.so
      
    3. Copy the _tf_adapter.so file to the tf_adapter folder, and change the file name from _tf_adapter.so to lib_tf_adapter.so.
      cp ${TFPLUGIN_INSTALL_PATH}/npu_bridge/_tf_adapter.so .
      mv _tf_adapter.so lib_tf_adapter.so
      

      ${TFPLUGIN_INSTALL_PATH} is the installation path of the TF Adapter package.
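The copy, rename, and soft-link pattern in this step can be exercised first in a scratch directory to confirm the expected layout before touching the real tree. The sketch below uses empty stand-in files rather than the real libraries:

```shell
# Exercise the layout from step 3 with stand-in files in a scratch directory.
workdir=$(mktemp -d)
cd "$workdir"
touch libpython3.7m.so.1.0 _tf_adapter.so     # stand-ins for the copied files
ln -s libpython3.7m.so.1.0 libpython3.7m.so   # soname-style link the linker resolves
mv _tf_adapter.so lib_tf_adapter.so           # Bazel cc_import expects a lib* name
ls -l
```

After this, the folder should contain lib_tf_adapter.so, libpython3.7m.so.1.0, and the libpython3.7m.so link pointing at it.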

  4. Build empty stub libtensorflow_framework.so and _pywrap_tensorflow_internal.so files.
    1. Run the following command in the tf_adapter folder:
      vim CMakeLists.txt
      
    2. Add the following content:
      file(TOUCH ${CMAKE_CURRENT_BINARY_DIR}/stub.c)
      add_library(_pywrap_tensorflow_internal SHARED ${CMAKE_CURRENT_BINARY_DIR}/stub.c)
      add_library(tensorflow_framework SHARED ${CMAKE_CURRENT_BINARY_DIR}/stub.c)
    3. Run the :wq! command to save the file and exit.
    4. Compile empty .so files.
      mkdir temp
      cd temp
      cmake ..
      make
      mv lib_pywrap_tensorflow_internal.so ../_pywrap_tensorflow_internal.so
      mv libtensorflow_framework.so ../libtensorflow_framework.so
      cd ..
      ln -s libtensorflow_framework.so libtensorflow_framework.so.1
      
    5. Configure the environment variable.
      export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$(pwd)
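As written, the three CMakeLists.txt lines from the substep above rely on CMake defaults, and file(TOUCH) requires CMake 3.12 or later. A minimal sketch of the complete file can be generated as shown below; the cmake_minimum_required and project() header lines are assumptions not shown in the original snippet:

```shell
# Write a complete stub CMakeLists.txt. The quoted 'EOF' keeps the
# ${CMAKE_...} variables from being expanded by the shell.
# Assumption: the two header lines are added here; the original snippet
# shows only the file(TOUCH) and add_library() lines.
cat > CMakeLists.txt <<'EOF'
cmake_minimum_required(VERSION 3.12)
project(tf_stubs C)
file(TOUCH ${CMAKE_CURRENT_BINARY_DIR}/stub.c)
add_library(_pywrap_tensorflow_internal SHARED ${CMAKE_CURRENT_BINARY_DIR}/stub.c)
add_library(tensorflow_framework SHARED ${CMAKE_CURRENT_BINARY_DIR}/stub.c)
EOF
```

Note that the _pywrap_tensorflow_internal target builds as lib_pywrap_tensorflow_internal.so, which is why substep 4 renames it to _pywrap_tensorflow_internal.so when moving it out of the temp directory.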
      
  5. Create the Bazel BUILD file.
    1. Run the following command to create a BUILD file in the tf_adapter folder:
      vim BUILD
    2. Add the following content:
      licenses(["notice"])  # BSD/MIT.
      
      cc_import(
          name = "tf_adapter",
          shared_library = "lib_tf_adapter.so",
          visibility = ["//visibility:public"] 
      )
      
      cc_import(
          name = "tf_python",
          shared_library = "libpython3.7m.so",
          visibility = ["//visibility:public"]
      )
    3. Run the :wq! command to save the file and exit.
  6. Add the two //third_party/tf_adapter dependency lines shown below to cc_binary in the BUILD file under the serving-1.15.0/tensorflow_serving/model_servers/ directory:
    cc_binary(
        name = "tensorflow_model_server",
        stamp = 1,
        visibility = [
            ":testing",
            "//tensorflow_serving:internal",
        ],
        deps = [
            ":tensorflow_model_server_main_lib",
            "//third_party/tf_adapter:tf_adapter",
            "//third_party/tf_adapter:tf_python",
            "@org_tensorflow//tensorflow/compiler/jit:xla_cpu_jit",
        ],
    )
  7. Compile TF Serving.

    Run the following command in the TF Serving source directory serving-1.15.0 to compile TF Serving:

    bazel --output_user_root=/opt/tf_serving build -c opt --cxxopt="-D_GLIBCXX_USE_CXX11_ABI=0" tensorflow_serving/model_servers:tensorflow_model_server
    

    The --output_user_root option specifies the Bazel output root, used here as the TF Serving build output path. Set it based on the actual situation.

  8. Create a soft link.

    The command used for creating a soft link is as follows:

    ln -s /opt/tf_serving/{tf_serving_ID}/execroot/tf_serving/bazel-out/xxx-opt/bin/tensorflow_serving/model_servers/tensorflow_model_server /usr/local/bin/tensorflow_model_server
    
    • {tf_serving_ID} is an automatically generated hash string, for example, 063944eceea3e72745362a0b6eb12a3c. Set it based on the actual situation.
    • The xxx-opt folder is automatically generated by Bazel and named for the build configuration. Replace it with the actual folder name.
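Because the {tf_serving_ID} hash and the xxx-opt folder name vary per machine, a small helper that searches the output root can stand in for hard-coding the path. A sketch, assuming the bazel build above completed:

```shell
# find_model_server: print the first tensorflow_model_server binary found
# under the given Bazel output root. The hash directory ({tf_serving_ID})
# and the *-opt configuration folder differ per machine, so we search
# instead of hard-coding them.
find_model_server() {
  find "$1" -type f -name tensorflow_model_server -path "*/model_servers/*" 2>/dev/null | head -n 1
}

# Usage (adjust the root to your --output_user_root):
#   ln -s "$(find_model_server /opt/tf_serving)" /usr/local/bin/tensorflow_model_server
```

If the function prints nothing, the build did not produce the binary; re-check step 7 before creating the link.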